The physics of maiming a child (repost because of “those” scooters)

Dear Driver,

When you backed out of a driveway and did not even see how I swerved behind your car to avoid T-boning you, how dare you tell me you were careful! I was 7 feet tall, dressed in bright yellow, and travelling at no more than 10 km/h. Perhaps a simple lesson in physics will help you and your fellow “driveway backers” realise how dangerous you are and adopt safer driving practices.

In the diagram you can see a car backing out of a driveway. Typically, when you are at the edge of your property and a fence (see photo below) blocks your view of the footpath, you are able to see about 1.7 metres along the footpath. Let us imagine a child on a trike riding at 5 km/h just out of your line of sight. How long does it take them to travel that 1.67 metres? The physics is quite easy.

5 km/h is 5000 metres in 60 × 60 seconds, i.e. about 1.4 m/s. Using time = distance ÷ speed, it takes about 1.2 seconds for the child to travel that 1.67 metres.

Now consider this. According to design guidelines for safe bicycle use, 2.5 seconds must be allowed for someone to observe the danger, react, apply the brakes, and stop. In other words, if you cover the distance from your driveway to the middle of the footpath, about 1 metre, in under 1.2 seconds you will almost certainly hit the child. That is a speed of just 3 km/h!
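For anyone who wants to check the arithmetic, here is a minimal sketch of the calculation in Python (the 1.67 m and 1 m distances are the figures used above; the function name is mine):

```python
def transit_time_s(distance_m, speed_kmh):
    # time = distance / speed, converting km/h to m/s (divide by 3.6)
    return distance_m / (speed_kmh / 3.6)

# A hidden child on a trike covering 1.67 m of footpath at 5 km/h
child_time = transit_time_s(1.67, 5)          # about 1.2 seconds

# The driver speed that covers the ~1 m to the middle of the footpath
# in that same window, converted back to km/h
collision_speed_kmh = 1.0 / child_time * 3.6  # about 3 km/h
```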

Now consider who else is on the footpath, all legally:

  • Pedestrians 5 km/h
  • Joggers 5-15 km/h
  • Kids on skateboards or scooters 10 km/h
  • Child on bicycle with small wheels, 10 km/h
  • Mobility scooter, 5-10 km/h
  • Me on my Trikke, 10 km/h
  • Postie on a bike 5-10 km/h.

For those going 10 km/h your speed needs to be just over 1.5 km/h to hit someone! That’s the legal people … but the Lime scooters at 20 km/h make it all the more necessary to slow down.

So, before you do some damage here is what you can do:

  • Never back out of a driveway unless you really really must.  If you think you must because of the design of your driveway, change the design!
  • Cut back those hedges and remove some of that fence so that you can see further. [City councils: please make a by-law to make this happen.]
  • Always always always stop at the end of your driveway (BEFORE THE FOOTPATH) and toot a horn.  Then proceed very very slowly.

By the way, you are legally obliged to give way:

(1) A driver entering or exiting a driveway must give way to a road user on a footpath, cycle path, or shared path (as described by clause 11.1A(1)).

Thank you for considering the physics of maiming a child; may you never find yourself in such a terrible situation.

Regards,

Dr John Pickering

A typical driveway with almost non-existent visibility

———

Feature Image: Intangible Arts https://www.flickr.com/photos/intangible/ under Creative Commons Attribution 2.0 licence.


Cheesecake files: A new test to rule out heart attacks in just a few minutes.

Your chest hurts, you go to the hospital (good move), you get rushed through and a nurse takes some blood and measures the electrical activity of your heart. A doctor asks you some questions. While she does so, the blood is being tested – the results are back already! Yeah, they are negative and everything else is OK, it’s not a heart attack – you can go home. This is the likely scenario in the near future thanks to new blood test technology which we, in Christchurch hospital’s Emergency Department, have been fortunate to be the first in the world to trial in patients. The results of our pilot study have now been published in a journal of the American Medical Association (JAMA Cardiology).

About 65,000 patients a year are investigated for heart attacks in New Zealand emergency departments, yet only about 15% of them are actually having a heart attack. New Zealand leads the world as the first country in which all patients are assessed by an accelerated diagnostic pathway that enables rapid evaluation and can send people home after two blood tests taken two to three hours apart (see here for more). This means many patients who once upon a time would have been admitted to hospital overnight can now be reassured after 4-6 hours that they are not having a heart attack and can go home. Nevertheless, there are enormous advantages for both patient and health system in being able to conclude even earlier that the pain isn’t life threatening.

The cork in the bottle is the time it takes for a blood sample to be analysed for signs of damage to the heart. These blood tests typically take 1 to 2 hours from the time of sampling (within ~15 minutes of arrival in the ED) until the results are available for the doctor to review. Because doctors are dealing with multiple patients at a time, their review and decisions about whether to send the patient home, or to admit them for more investigation, are further delayed. A point-of-care test is one that happens on a small machine near the bedside and can produce results while the doctor is still examining the patient. Until now, though, the precision of these machines has not been good enough for use in emergency departments. When one manufacturer told us that their new technology might now have sufficient precision we were keen to test it, so we undertook a first-in-the-world study in patients entering the emergency department of Christchurch hospital whom the attending doctor was investigating for a possible heart attack.

Thanks to the volunteer patients (I love volunteers) who gave some extra blood, we measured the troponin concentration with this new point-of-care test (called the next-generation point-of-care troponin I: TnI-Nx). Troponin comes from the heart muscle and is released into the blood during a heart attack. When the troponin concentrations in the blood are very, very low we can be confident that the source of the patient’s discomfort is not a heart attack; such low concentrations require a very precise measurement test. Often, a very low concentration means the patient can safely go home. In 354 volunteers we measured troponin with the TnI-Nx assay when they first came to the emergency department. Their treatment didn’t change, and all clinical decisions were based on the normal laboratory-based troponin (measured on entry to the emergency department and again 2 hours later). From the blood samples we collected and the measurements we made, we could work out what could have happened if we had used the TnI-Nx results instead.

In our study the TnI-Nx troponin measurement was as good as, and possibly slightly better than, the laboratory-based troponin measurement at ruling out a heart attack. We found 57% of the patients being investigated had troponin concentrations measured with TnI-Nx below a threshold at which we could be confident that they were not having a heart attack. All 57 patients who were actually having a heart attack had higher concentrations.

When implemented, our results may mean that instead of waiting 3-6 hours for a result, half of the patients being investigated could know within about 30 minutes of arriving at the ED whether they are having a heart attack or not. This early reassurance would be a relief to many, as well as reducing over-crowding in the emergency department and freeing up staff for other tasks. But before we implement the new test, we must validate it in more patients – this is a study we are carrying out now. Validation will enable us to more precisely determine a threshold concentration of TnI-Nx for clinical use which we can, with a very high degree of certainty, safely use to rule out a heart attack.

The test should also allow people living in rural areas to get care just as good as that in urban emergency departments, because it could be deployed in rural hospitals and general practices. This would save many lengthy, worrying, and expensive trips to an urban emergency department.

This study was carried out by the Christchurch Emergency Department research group (director and senior author Dr Martin Than) in conjunction with the Christchurch Heart Institute (University of Otago Christchurch).  My colleague, Dr Joanna Young did much of the hard yards, and we thank our clinical research nurses and assistant for all they did to take blood samples, collect data, and lend a hand around the ED.  The manufacturer of the blood test, Abbott Point-of-care, provided the tests free of charge, but they were blinded to the results and all analysis was conducted independent of them.

How we envisage TnI-Nx may be used in the future to allow very early rule out of heart attacks

Please note – patients experiencing sudden onset chest-pain should always seek immediate medical attention.

I am fortunate to hold a Senior Research Fellowship in Acute Care sponsored by the Canterbury Medical Research Foundation, the Emergency Care Foundation, and the Canterbury District Health Board which enables me to participate in these studies.

ps.  You’ll have to read some of my older posts if you want to know why “Cheesecake files”

 

The Treatment of Kidney Failure in New Zealand

I am delighted to introduce a guest post from Dr Kelvin Lynn. Dr Lynn worked as a Nephrologist at Christchurch Hospital for 35 years and retired in 2015.  He is the lead author for a book just published:

The Treatment of Kidney Failure in New Zealand

Authors: Kelvin L Lynn, Adrian L Buttimore, Peter J Hatfield, Martin R Wallace 2018

ISBN PDF – 978-0-473-45293-3

Available at no charge at www.kidneys.co.nz/Kidney-History from 16 October 2018.

Dr Kelvin Lynn and his fellow editors tell the history of the treatment of people with kidney failure in New Zealand; beginning in the early 1950s this story encompasses remarkable experiences of patients and their families, and of the contributions made by dedicated health professionals. It also reveals the challenges and ethics of meeting an ever-increasing demand for treatment.

New Zealand doctors were early adopters of new dialysis technology. The first peritoneal dialysis (PD) treatment in New Zealand occurred at Wellington Hospital in 1954. Two young doctors tried a recently reported treatment using homemade equipment – classic Number 8 wire technology. Dr Neil Turnbull was a medical registrar in 1954 when he admitted a pale, vomiting, dehydrated 24-year-old woman who had not passed urine for the past nine days. Fifteen days before admission she had tried to terminate an unwanted pregnancy by infusing a Dettol solution into her cervical canal. In spite of rehydration with blood and five per cent glucose she became comatose. It was then that pathology registrar Dr Dave Reid suggested trying PD, which he had recently read about in the New England Journal of Medicine. After mixing 20 litres of a glucose solution in sterilised glass bottles they had to stop, as the solution had caramelised. They supposed the autoclave (steriliser) had been too hot, and were proved right when, after the autoclave temperature was reduced, the new glucose solution remained clear. This was not the end of their technical problems, however, for after running two litres of the solution through the polythene tube that they had inserted into the right iliac fossa with a trocar and cannula, there was no drainage. Undeterred, they pulled the tube out and established good drainage by pricking holes in the tubing with a hot 22-gauge needle. After three days of peritoneal dialysis the patient began passing increasing volumes of urine and then regained consciousness. When last seen by Turnbull in 1992, she had normal renal function.

This book recounts the contribution of doctors, nurses, technicians, and patients and their families to the story of kidney treatment in New Zealand. Social and political changes in our country since the 1950s have critically influenced the development of treatment services for New Zealanders with kidney failure. The improvements in technology and community expectations regarding access to treatment over the past 50 years are discussed as well as the issues for patients and families coming to terms with kidney failure and its treatment.

This story is illustrated with many anecdotes and historical photographs.

  • The experience of living a life with kidney failure is recounted from patient interviews. These stories are a testament to the bravery and determination of these individuals. Rob Brydon’s story demonstrates what ordinary people were able to do in the face of kidney failure.

Rob began home haemodialysis on 31 August 1976 just after getting married. After two failed transplants, the second from his brother Nev, he remains on HHD over 40 years later.  Most of this time, he worked full-time. Following redundancy in 1993, he started his own painting business which he ran for ten years until he had both legs amputated below the knee, bringing this to an end. Rob had a profound anaemia as the result of having both his kidneys removed to control his high blood pressure. He built his own house while his haemoglobin concentration was only 40 to 50 g/L, and subsequently Rob was one of the first patients in New Zealand to benefit from erythropoietin treatment for renal anaemia. Rob remembers the burden of having to reuse dialysers and blood lines and the unpleasantness of using formalin for sterilisation. His advice to other dialysis patients is to “try to keep your life as normal as possible.”

  • There are chapters devoted to the professional development of renal nurses and dialysis technicians who have played a key role in the progress made in kidney treatment. Nurses were important members of the early clinical teams who pioneered dialysis treatment. Now renal nursing is an established nursing specialty. Hospital technicians who maintained the early dialysis equipment quickly took up clinical roles, particularly in training patients for dialysis at home.
  • There is an account of the trends and statistics of dialysis treatment in the past and a chapter discussing where dialysis treatment may go in the future.

The first home dialysis machine used in New Zealand: a Drake Willock 4011, 1972

Enquiries to kidneyhistory@gmail.com

 

PBRF: The end is nigh

I’d like to say the end is nigh for the performance-based research fund (PBRF), full stop. A few months ago, I demonstrated how the expensive and tedious production of evidence portfolios by 7000 academic staff will do nothing to change the redistribution of research funding – the purported reason for PBRF. So, I’d like to say the end is nigh because the minister responsible (Hon. Chris Hipkins) has seen the light and pulled the plug. But, alas, it is simply that all portfolios have now been submitted and so await assessment by the peer review panels. About 250 people serve on these panels, nearly all of them Professors, most from New Zealand but with a sprinkling from Australia and elsewhere. They represent a gathering of some of the best minds in the country. From my perspective it is a terrible waste of time for them and of tax-payers’ money for the rest of us.

In completing my portfolio I received a message concerning citation counts that “Panels are not a fan of Google scholar as they think the counts are over-inflated. You can use this but also supply cite counts from either Scopus or WoS.” Frankly, I think the panellists are far too intelligent to worry about this, and I expect they realise that while Google Scholar counts are over-inflated, Scopus (owned by Elsevier!) and WoS under-count (eg by not counting book chapters, leaving out some journals, etc). What matters, if citations have to be used at all, is that apples are compared with apples. I’ve discussed some of these problems recently.

Before I suggest a solution that doesn’t require 250 Professors sitting in days of meetings, or 7000 academics spending days completing evidence portfolios, I’ve produced a graphic to illustrate the problem of comparing apples with oranges. Google Scholar ranks journals according to the 5-year h-index. These can be explored through the various categories and sub-categories Google Scholar uses (here). Each of the 8 major categories has different numbers of citations, and so different h-indices. For example, Social Sciences is a small fraction of Health & Medical Sciences, but is larger than Humanities, Literature & Arts. Within each category there are large differences between sub-categories. For example, in the Health & Medical Sciences category a cardiologist will be publishing in cardiology journals whose top 20 h-indices range from 176 to 56, whereas a nursing academic will be publishing in journals whose top 20 h-indices range from 59 to 31. So what is needed is a system that takes into account where the academic is publishing.

Visualisation of Google Scholar’s h-5 index Categories (large ellipses at the bottom) and sub-categories (smaller ellipses). Each sub-category ellipse represents in height and area the sum of the h-indices for 20 journals within that sub-category.

Google Scholar, which, unlike WoS and Scopus, is open and public, can be scraped with just three lines of code in R (a free and open programming language) to extract the last 6 years of published articles and their citations for any academic with a profile on Google Scholar. Thousands of NZ academics already have one. Here’s the code which extracts my last 6 years of data:

library(scholar)  # interface to Google Scholar profiles
library(dplyr)    # provides %>% and filter()
# "Ig74otYAAAAJ" is the user id from my Google Scholar profile URL
pubs <- get_publications("Ig74otYAAAAJ") %>% filter(year >= 2012 & year <= 2017)

 

The “Ig74otYAAAAJ” is simply my unique identifier, found in the user= parameter of the URL of my Google Scholar profile (https://scholar.google.co.nz/citations?hl=en&user=Ig74otYAAAAJ&hl).

I’ve also been able to scrape the list of top 20 journals and their h-index data for the 260 sub-categories from Google Scholar.  Here is what Cardiology looks like:

Google Scholar’s top 20 journals for Cardiology as at 13 July 2018: https://scholar.google.co.nz/citations?view_op=top_venues&hl=en&vq=med_cardiology

So, how do we use all this data to compare academics without them having to submit screeds of data themselves? All that is needed is for them to be registered with their Google Scholar identity, and for there to be an appropriate formula for comparing academics. Such a formula is likely to have several components:

  1. Points for ranking within a category. For example, 20 pts for a publication ranked first in a subcategory, down to 1 pt for a publication ranked 20th and, say, 0.5 pts for ones not ranked.
  2. Points that reflect the number of citations a paper has received relative to the 5-year h-index of the journal, with a factor that accounts for the age of the paper (because papers published earlier are likely to be cited more). For example: (citations ÷ journal’s 5-year h-index) × (2 ÷ age in years) × 20. I use 20 just to give it a value similar to that of the ranking in point 1 above.
  3. Points that reflect the author’s contribution.  Perhaps 20 for first author, 16 second, 12, 8, and 4 for the rest + a bonus 4 for being Senior author at the end.
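As a sketch only, the three components could be combined like this (the function and variable names are mine, and the rank points follow the worked examples later in the post, i.e. 20 minus the journal's rank in its sub-category):

```python
def pbrf_points(journal_rank, citations, journal_h5, age_years, author_pts):
    # 1. Ranking points: 20 minus the journal's rank in its Google Scholar
    #    sub-category; 0.5 if the journal is not in the top 20
    rank_pts = (20 - journal_rank) if journal_rank is not None else 0.5
    # 2. Age-adjusted citation points relative to the journal's 5-year h-index:
    #    citations / h5 * 2 / age * 20
    cite_pts = citations / journal_h5 * 2 / age_years * 20
    # 3. Author-contribution points passed in directly
    #    (e.g. 20 first author, 16 second, ... +4 bonus for senior author)
    return rank_pts + cite_pts + author_pts
```

Plugging in the two worked examples that follow reproduces their totals to within rounding.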

Here are a couple of examples of mine from the last 6 years:

Pickering JW, Endre ZH. New Metrics for Assessing Diagnostic Potential of Candidate Biomarkers. Clinical Journal of the American Society of Nephrology (CJASN) 2012;7:1355–64. Citations 101.

The appropriate sub-category is “Urology & Nephrology” (though I wonder why these are grouped together, I’ve published in many Nephrology, but never a Urology journal).

  1. Ranking:  12 points.    [CJASN is ranked 8th, so 20-8 = 12]
  2. Citations:  10.8 points. [ CJASN 5y h-index is 62. Paper is 6 years old. 101/62 * 2/6 * 20 =10.8]
  3. Author: 20 points [ 1st author]
  4. TOTAL: 42.8

Similarly for:

Flaws D, Than MP, Scheuermeyer FX, … Pickering JW, Cullen, L. External validation of the emergency department assessment of chest pain score accelerated diagnostic pathway (EDACS-ADP). Emerg Med J (EMJ) 2016;33(9):618–25. Citations 10.

The appropriate sub-category is “Emergency Medicine”.

  1. Ranking:  12 points.    [EMJ is ranked 8th, so 20-8 = 12]
  2. Citations:  5.6 points. [EMJ 5y h-index is 36. Paper is 2 years old. 10/36 * 2/2 * 20 = 5.6]
  3. Author: 4 points [I’m not in the top 4 authors or senior author]
  4. TOTAL: 21.6 pts

This exercise for every academic could be done by one person with some coding skills.  I’m sure it could be calibrated to previous results and funding allocations by taking citations and papers for an earlier period. There may need to be tweaks to account for other kinds of academic outputs than just journal articles, but there are plenty of metrics available.

To summarise, I have just saved the country many millions of dollars and allowed academics to devote their time to what really matters.  All it needs now is for the decision makers to open their eyes and see the possibilities.

(ps. even easier would be to use the research component of the Times Higher Education World University Rankings and be done with it).

More on the PBRFs new clothes

A few weeks ago I outed the multi-million-dollar exercise that is the Quality Evaluation component of the performance-based research fund (PBRF) as futile, because there is no net gain in research dollars for the NZ academic community. Having revealed the Emperor’s new clothes, I awaited the call from the Minister in charge to tell me they’d cancelled the round out of futility. When that didn’t come, I pinned my hopes on a revolt by the University Vice-Chancellors. Alas, the VCs aren’t revolting. This week, my goal is for there to be mass resignations from the 30 or so committees charged with assessing the evidence portfolios of individual academics, and for individual academics to make last-minute changes to their portfolios so as to maintain academic integrity.

I love academic metrics – these ways and means of assessing the relative worth of an individual’s contribution to academia, or the impact of a piece of scholarly work, are fun. Some are simple, merely the counting of citations to a particular journal article or book chapter; others are more complex, such as the various forms of the h-index. It is fun to watch the number of citations of an article gradually creep up and to think “someone thinks what I wrote worth taking notice of”. However, these metrics are largely nonsense and should never be used to compare academics. Yet, for PBRF and promotions we are encouraged to talk of citations and other such metrics. Maybe, and only maybe, that’s OK if we are comparing how well we are performing this year against a previous year, but it is not OK if we are comparing one academic against another. I’ve recently published in both emergency medicine journals and cardiology journals. The emergency medicine field is a small fraction of the size of cardiology and, consequently, there are fewer journals and fewer citations. It would be nonsense to compare citation rates for an emergency medicine academic with those of a cardiology academic.

If the metrics around individual scholars are nonsense, those purporting to assess the relative importance (“rank”) of an academic journal are total $%^!!!!.  The most common is the Impact Factor, but there are others like the 5-year H-index for a journal.  To promote them, or use them, is to chip away at academic integrity.  Much has been written elsewhere about impact factors.  They are simply an average of a skewed distribution.  I do not allow students to report data in this way.  Several Nobel prize winners have spoken against them.  Yet, we are encouraged to let the assessing committees know how journals rank.

Even if the citation metrics and impact factors were not dodgy, there is still a huge problem facing the assessing committees: they are called on to compare apples with oranges. Not all metrics are created equal. ResearchGate, Google Scholar, Scopus and Web of Science all count citations and report h-indices. No two are the same. A cursory glance at some of my own papers shows more than 20% variation in counts between them. I even have a paper with citation counts of 37, 42, 0 and 0. Some journals are included and some are not, depending on how each company has set up its algorithms. Book chapters are not included by some, but are by others. There are also multiple sites for ranking journals using differing metrics. Expecting assessing committees to work with multiple metrics which all mean something different is like expecting engineers to build a rocket but not allowing them to use a standard metre rule.

To sum up, PBRF evidence portfolio assessment is a waste of resources, and it encourages the use of integrity-busting metrics that should not be used to rank individual academic impact.

Cheesecake Files: The ICare-Acute Coronary Syndrome (heart attack) study

Hundreds of nurses, Emergency Department doctors, Cardiologists and other specialists, laboratory staff, administrators and managers from every hospital in New Zealand with an emergency department have come together to implement new, effective, and safe pathways for patients who think they may be having a heart attack.  Today, Dr Martin Than (CDHB, Emergency Department) presented to the American Heart Association results of our research into the national implementation of clinical pathways that incorporate an accelerated diagnostic protocol (ADP) for patients with possible heart attacks.  Simultaneously, a paper detailing that research is appearing in the academic journal Circulation.

The headline is that in the 7 hospitals we monitored (representing about a third of all ED admissions in NZ a year), there was a more than twofold increase in the number of patients who were safely discharged from the ED within 6 hours of arrival and told “It’s OK, you are not having a heart attack”.

Improving Care processes for patients with a possible heart attack.

Why is this important?

About 65,000 of the 1 million presentations to EDs each year in New Zealand are for patients whom the attending doctors think may be having a heart attack. However, only 10-15% of those 65,000 are actually having a heart attack. The traditional approach to assessment is long and drawn out, involves many resources, and means thousands of people are admitted to a hospital ward even though it turns out they are not having a heart attack. Of course, this means that they and their families have a very uncomfortable 24 hours or so wondering what is going on. So, any method that safely helps to reassure some of those patients and return them home early is a good thing.

What is a clinical pathway?

A clinical pathway is a written document based on best practice guidelines that is used by physicians to manage the course of care and treatment of patients with a particular condition or possible condition.  It is intended to standardise and set out the time frame for investigation and treatment within a particular health care setting – so it must take into account the resources available for a particular hospital.   For example, each hospital must document how a patient is assessed and if, for example, they are assessed within the ED as having a high-risk of a heart attack, where they must go.  In a large metropolitan hospital, this may mean simply passing them into the care of the cardiology department.  In a smaller setting like Taupo, where there is  no cardiology department, it may mean documenting when and how they are transported to Rotorua or Waikato hospital.

What is an accelerated diagnostic protocol?

An accelerated diagnostic protocol (ADP) is a component of the clinical pathway that enables the ED doctors to make decisions more rapidly and consistently about where to send the patient. In all cases in New Zealand the ADPs for evaluating suspected heart attacks have 3 main components: (i) an immediate measurement of the electrical activity of the heart (an ECG), (ii) an immediate blood sample to measure the concentration of a marker of heart muscle damage called troponin, plus a second sample 2 or 3 hours later, and (iii) a risk score based on demographics, prior history of heart conditions, smoking etc., and the nature of the pain (ie where it hurts, and whether it hurts when someone pushes on the chest or when the patient takes deep breaths etc). Importantly, these components enable a more rapid assessment of patients than traditional practice and, in particular, enable patients to be rapidly stratified into low-risk, intermediate-risk, and high-risk groups. Usually the low-risk patients can be sent home.
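The three components combine into a simple decision rule. The sketch below is an illustrative outline only: the cut-off and score boundary are invented placeholders, not the clinical values used in any hospital's ADP.

```python
def adp_stratify(ecg_ischaemia, troponin_0h, troponin_2h, risk_score):
    # Placeholder thresholds for illustration only (NOT clinical values)
    TROPONIN_CUTOFF = 26   # hypothetical assay cut-off, ng/L
    LOW_RISK_SCORE = 16    # hypothetical risk-score boundary

    # Any ischaemic ECG change, or elevated troponin at either draw -> high risk
    if ecg_ischaemia or troponin_0h >= TROPONIN_CUTOFF or troponin_2h >= TROPONIN_CUTOFF:
        return "high"
    # Low score with normal ECG and troponins -> candidate for early discharge
    if risk_score < LOW_RISK_SCORE:
        return "low"
    # Otherwise keep for further observation or testing
    return "intermediate"
```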

What was done?

The Ministry of Health asked every ED to put in place a pathway. Over an ~18-month period, a series of meetings was held at each hospital, led by Dr Than, the clinical lead physician for the project. Critically, at each meeting there were multiple members of the ED (doctors and nurses), cardiology, general wards, laboratory staff, and hospital administrators. The evidence for different ADPs was presented. Each hospital had to assess this evidence itself and decide on the particular ADP it would use. Potential barriers to implementation and possible solutions were discussed. Critically, champions for different aspects of the pathway implementation process were identified in each hospital. These people led the process internally.

Oversight of the implementation was provided by an ad hoc advisory board put together by the Ministry of Health, comprising MoH officials, Dr Than, cardiologists, and myself.

The Improving Care processes for patients with suspected Acute Coronary Syndrome (ICare-ACS) study was a Health Research Council sponsored study with co-sponsorship of staff time by participating hospitals. Its goal was to measure any changes in each hospital in the proportion of patients being discharged home from the ED early, and to check whether they were being discharged safely (ie to check that people with heart attacks were not being sent home). Dr Than and I co-led this project, but there were many involved who not only set up the pathways in each of the 7 participating study hospitals, but who also helped obtain the data for me to crunch.

What were the study results?

In the pre-clinical pathway implementation phase (6 months for each hospital) there were 11,529 patients assessed for possible heart attack. Overall, 8.3% of them were sent home within 6 hours of arrival (we used 6 hours because this is a national target for having patients leave the ED).  The proportions of patients sent home varied considerably between hospitals – from 2.7% to 37.7%.  Of those sent home early, a very small proportion (0.52%) had what we call a major adverse event (eg a heart attack, a cardiac arrest, or death for any reason) within 30 days.  This is actually a very good number (it is practically impossible to be 0%).

We monitored each hospital for at least 5 months after pathway implementation, and for a median of 10.6 months. Of the 19,803 patients, 18.4% were sent home within 6 hours of arrival, ie the pathway more than doubled the proportion of patients who were sent home early. Importantly, all 7 of the hospitals sent more patients home earlier. The actual percentages sent home in each hospital still varied, showing there is more room for improvement in some hospitals than in others. Very importantly, the rate of major adverse events in those sent home remained very low (0.44%). Indeed, when we looked in detail at the few adverse events, in most cases there had been a deviation from the local clinical pathway. This suggests that some ongoing education and “embedding in” of the pathways may improve safety even more.
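A quick back-of-the-envelope check of the headline numbers (this is my illustrative arithmetic on the reported percentages, not the study's analysis code):

```python
# Reported figures from the pre- and post-implementation phases
pre_n, pre_pct_home = 11529, 8.3      # patients assessed; % home within 6 h
post_n, post_pct_home = 19803, 18.4

pre_home = pre_n * pre_pct_home / 100     # roughly 957 patients sent home early
post_home = post_n * post_pct_home / 100  # roughly 3644 patients
fold_change = post_pct_home / pre_pct_home  # a bit over 2.2-fold
```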

The study also showed that amongst all patients without a heart attack the implementation of the pathway reduced the median length of stay in hospital by nearly 3 hours.  Using crude numbers for the cost of an acute event in a hospital, I estimate that this is a saving to the health system of $9.5 million per year.  These types of calculations are difficult and full of assumptions; nevertheless, I can be confident that the true savings are in the millions (psst… Government… I wouldn’t mind a fraction of this saving to carry on research please).
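The savings estimate has a simple shape: patients × hours saved × cost per hour.  Here is a sketch of that arithmetic; the patient volume and hourly cost below are placeholders I have chosen purely for illustration, not figures from the study, and they merely land in the same ballpark as the $9.5 million quoted above:

```python
# Illustrative only: patients_per_year and cost_per_bed_hour are assumed
# placeholder values, NOT figures from the study. They just show the shape
# of the estimate: patients x hours saved x cost per hour.
hours_saved = 3                 # "nearly 3 hours" median reduction in stay
patients_per_year = 32_000      # hypothetical annual volume of non-heart-attack patients
cost_per_bed_hour = 100         # hypothetical cost of an acute hospital hour (NZD)

annual_saving = patients_per_year * hours_saved * cost_per_bed_hour
print(f"Illustrative annual saving: ${annual_saving / 1e6:.1f} million")
```

Swap in any plausible volume and cost and the answer stays in the millions, which is why the conclusion is robust even though the inputs are uncertain.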

How did this come about?

This study and the pathway implementation is the result of a decade long series of studies in Christchurch hospital and some international studies, particularly with colleagues in Brisbane.  These studies have involved ED staff, cardiologists, research nurses, University of Otago academics (particularly those in the Christchurch Heart Institute) and many others.  They began with an international observational study which measured troponin concentrations at earlier than normal time points to see whether they gave information that would enable earlier discharge of some patients.  This was followed by the world’s first randomised trial of an accelerated diagnostic pathway (ADP) versus the then-standard practice.  That showed that the ADP resulted in more patients being safely sent home.  It was immediately adopted as standard practice in Christchurch.  The ADP was refined with a more “fit for purpose” risk assessment tool (called EDACS – developed locally and with collaboration of colleagues in Brisbane).  The EDACS protocol was then compared to the previous protocol (called ADAPT) in a second randomised trial.  It was at least as good with potential for safely discharging even more patients.  It is currently standard practice in Christchurch.

As a consequence of the Christchurch work, the Ministry of Health said, effectively, ‘great, we want all of New Zealand to adopt a similar approach’, and the rest, as they say, is history.  Now, all EDs have a clinical pathway in place, and all use an evidence-based ADP – two use ADAPT and the rest use EDACS, with one exception which uses a more ‘troponin centric’ approach (still evidence-based) which I won’t go into here.  Meanwhile, all of Queensland has adopted the ADAPT approach and we know of many individual hospitals in Australia, Europe and Iran (yes) which have adopted EDACS.

Other help

As mentioned already, the Health Research Council and the Ministry of Health along with all those medical professionals were integral to getting to where we are today.  Also integral were all those patients who agreed to participate in the randomised trials.  Medical research is built on the generosity of the patient volunteer.  Behind the scenes is our research manager, Alieke, who ensures doctors run on time.  Finally, I am very fortunate to be the recipient of a research fellowship that enables me to do what I do.  I thank my sponsors, the Emergency Care Foundation, Canterbury Medical Research Foundation, and Canterbury District Health Board.  Some of the earlier work has also been done in part with my University of Otago Christchurch hat on.  Thank you all.

Half a million Kiwis suddenly have high blood pressure

At 10am 14 November 2017 NZST millions of people around the world suddenly had high blood pressure. This will come as a shock to many and may precipitate a crisis of hand-wringing and other odd behaviour, like over-medication and jogging.

The American Heart Association and American College of Cardiology have just announced a redefinition of high blood pressure.

High blood pressure is now defined as readings of 130 mm Hg and higher for the systolic blood pressure measurement, or readings of 80 and higher for the diastolic measurement. That is a change from the old definition of 140/90 and higher, reflecting complications that can occur at those lower numbers. (link)

Announced at the annual American Heart Association conference, this is bound to cause some consternation.  It shifts 14% of the US adult population into the “high blood pressure” category, and I estimate that it will do something similar for the NZ population, meaning half a million New Zealanders who didn’t have high blood pressure at 9am now have high blood pressure (assuming NZ cardiologists follow their US colleagues).
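The half-a-million figure is a simple back-of-envelope calculation: apply the US reclassification share to NZ’s adult population.  A sketch, assuming an NZ adult population of roughly 3.7 million in 2017 (that population figure is my assumption, not a census value):

```python
# Back-of-envelope version of the NZ estimate. The adult population figure
# is an assumption (roughly the 2017 NZ adult population), not a census value.
nz_adults = 3_700_000    # assumed NZ adult population, 2017
shift = 0.14             # share of US adults reclassified under the new definition

newly_hypertensive = nz_adults * shift
print(f"Roughly {newly_hypertensive:,.0f} New Zealanders newly 'hypertensive'")
```

Which lands at roughly 518,000, ie about half a million people reclassified overnight.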

While this is, of course, absurd, it also highlights the seriousness with which cardiologists take elevated blood pressure – maybe we all should take it a bit more seriously, perhaps park the car further from work and walk a little (likely to be cheaper too).

Have you got high blood pressure? (c) American Heart Association