Articles

Read the latest articles relevant to your clinical practice, including exclusive insights from Healthed surveys and polls.

By reading selected clinical articles, you earn CPD in the Educational Activities (EA) category whenever you click the “Claim CPD” button and follow the prompts. 

Dr Linda Calabresi

Faecal transplantation has been gaining momentum as a mainstream treatment over recent years, but now a systematic review published in the MJA puts it ahead of antibiotics in effectiveness against Clostridium difficile-associated diarrhoea.

The literature search examined all the randomised controlled trials on the topic up until February this year, including some recently published studies, and concluded there was moderate quality evidence that faecal microbiota transplantation is more effective in patients with Clostridium difficile-associated diarrhoea than either vancomycin or placebo. The review also found that samples that had been frozen and then thawed prior to transplantation were as effective as fresh samples. “Our systematic review also highlights the fact that frozen/thawed transplants – a more convenient approach that reduces the burden on a donor to supply a sample on the day it is needed – is as effective as fresh [faecal microbiota transplant],” the authors said.

However, there was less clarity about the optimal method of administering the transplanted microbiota. “Our analysis indicates that naso-duodenal and colonoscopic application may be more effective than retention enemas, but this conclusion relies on indirect comparisons of subgroups,” they concluded, suggesting that further research was needed to determine the best route of administration. More evidence is also needed about the most appropriate donor – whether they should be related, unrelated or anonymous, or whether ‘pooling stool from several donors’ would be the best way to go.

“Over the past 20 years the worldwide incidence of [Clostridium difficile-associated diarrhoea] has more than doubled, and outbreaks have been associated with greater morbidity and mortality, although to a lesser extent in Australia,” the study authors said. Even though recent guidelines from Europe and North America now recommend these transplants to treat antibiotic-resistant Clostridium difficile-associated diarrhoea, the international authors of the review said these recommendations were based on relatively poor evidence. It is expected that this systematic review, which includes more scientifically robust clinical trials, will inform future guidelines on the topic, particularly in Australia and New Zealand, where guidelines on treating Clostridium difficile-associated diarrhoea are currently in need of updating.

Ref: doi: 10.5694/mja17.00295

Prof Graeme Suthers

Examining the structure of chromosomes

The first studies in human genetics were done in the early 1900s, well before we had any idea of the structure of DNA or chromosomes. It was not until the 1950s that the double helix was deciphered, that we realised chromosomes were large bundles of DNA, and that we were able to visualise the number and shape of chromosomes under the microscope. In just a few years, numerous clinical disorders were identified as being due to abnormalities in the number or shape of chromosomes, and the field of “cytogenetics” was born.

Over the next five decades, techniques improved. With the right sample and a good microscope, the laboratory could detect an abnormal gain or loss as small as 5-10 million base pairs of DNA on a specific chromosome. The light microscope reigned supreme as the ultimate tool for genetic analysis!

Examining the mass of chromosomes

In the last 10-15 years, a different technology called “microarrays” has challenged the supremacy of the microscope in genetic analysis. There are many different implementations of microarrays, but in essence they are all based on breaking the chromosomes from a tissue sample into millions of tiny DNA fragments, thereby destroying the structural cues used in microscopy. Each fragment then binds to a particular location on a prepared surface, and the amount of bound fragment is measured. The prepared surface, a “microarray”, is only a centimetre across and can have defined locations for millions of specific DNA fragments. The relative amounts of specific fragments can indicate tiny chromosomal regions in which there is a relative deficiency or excess of material. For example, in a person with Down syndrome (trisomy 21), the locations on the microarray that bind fragments derived from chromosome 21 will have 1 ½ times the number of fragments as locations which correspond to other chromosomes (three copies from chromosome 21 versus two copies from other chromosomes). The microarray could be regarded as examining the relative mass, rather than the shape, of specific chromosomal regions. Current microarrays can identify loss or gain of chromosomal material that is 10-100 times smaller than would be visible with the microscope. This has markedly improved the diagnostic yield in many situations but, as described below, conventional cytogenetics by light microscopy still has a role to play.
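The dosage logic described above is easy to illustrate. Below is a minimal sketch, not a real analysis pipeline: the probe signals are invented, and a real microarray involves millions of probes plus normalisation steps, but the copy-number ratio it computes is the quantity the article describes (three copies versus two giving a ratio of about 1.5).

```python
# Minimal sketch of microarray dosage detection (hypothetical signals).
# Copy number of a region is inferred from the ratio of its probe
# signals to signals at reference probes on other chromosomes.

chr21_signals = [148, 152, 150, 149]      # probes in the chromosome 21 region
reference_signals = [99, 101, 100, 100]   # probes on other chromosomes

mean = lambda xs: sum(xs) / len(xs)
ratio = mean(chr21_signals) / mean(reference_signals)

# Two copies give a ratio near 1.0; trisomy gives ~1.5 (3 copies vs 2).
if ratio > 1.25:
    print(f"ratio {ratio:.2f}: relative excess, consistent with trisomy")
elif ratio < 0.75:
    print(f"ratio {ratio:.2f}: relative deficiency, consistent with deletion")
else:
    print(f"ratio {ratio:.2f}: no dosage change detected")
```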

Microarrays in paediatrics

Conventional cytogenetics will identify a chromosome abnormality in 3-5% of children with intellectual disability or multiple malformations. A microarray will identify the same abnormality in those children, plus abnormalities in a further 10-15%, i.e. the yield from microarray studies is approximately 15-20% (1). For this reason, microarray studies are the recommended type of cytogenetic analysis in the investigation of children or adults with intellectual disability or multiple malformations. There is a specific Medicare item for “diagnostic studies of a person with developmental delay, intellectual disability, autism, or at least two congenital abnormalities” by microarray. Requesters should request microarray analysis (item 73292) rather than use the less specific request for chromosome studies (item 73289).

There are three cautions about microarray studies in this setting. First, a microarray will not detect every familial disorder. Intellectual disability due to a single gene disorder (e.g. fragile X syndrome) will not be detected by a microarray. Second, experience with microarrays has demonstrated that some gains and losses of genetic material are benign and familial. It may be necessary to test the parents as well as the child to clarify the clinical significance of an uncommon change identified by microarray; the laboratory would provide guidance in such instances. And third, a microarray may identify an unexpected abnormality that has clinical consequences other than those which triggered the investigation.

Microarrays in antenatal care

The use of microarrays to investigate children with multiple malformations has now been extended to the investigation of fetuses with malformations. By using microarrays rather than conventional microscopy, the diagnostic yield from antenatal cytogenetics has increased by 6% (2). The cautions noted above still apply, i.e. a microarray cannot detect every genetic cause of malformations, and determining the clinical significance of an uncommon finding may require additional studies.

Microarrays can also be useful in the investigation of miscarriage and stillbirth. Most miscarriages are due to chromosome abnormalities which occur during the formation of the sperm or egg, or during early embryogenesis (3). These abnormalities are not inherited from either parent and hence do not constitute a hazard in subsequent pregnancies. Many clinicians and couples wish to confirm that a miscarriage was due to a sporadic chromosome abnormality that carries little risk for a subsequent pregnancy. This analysis can be done by either microarray or microscopic analysis of the products of conception. Microscopic analysis requires viable tissue, and up to 30% of studies may fail. Microarray analysis is preferred because it has better resolution and does not require living cells; as a result, the yield from microarray analysis is much higher (2). Requesters should specifically request microarray analysis, utilising the non-specific MBS item (73287).

Situations in which microarrays should not be used

There are two important antenatal situations in which microarrays should not be used: preconception screening, and investigation after a high risk non-invasive prenatal testing (NIPT) result. As noted above, a microarray measures the relative amount of genetic material from a specific location on a chromosome; it does not evaluate the shape of that chromosome.

Approximately 1 in 1,000 healthy people have a balanced translocation, i.e. part of one chromosome is attached to a different chromosome. The overall amount of genetic material is normal and there is usually no clinical consequence of this rearrangement. A balanced translocation would not be detected by microarray because there is no net gain or loss of chromosomal material. Microscopic analysis is likely to detect the translocation because of the change in shape of the two chromosomes involved.

A person with a translocation can produce eggs or sperm that are unbalanced, having an abnormal gain or loss of chromosome material. This can cause infertility, recurrent miscarriages, or the birth of a child with intellectual disability or malformations. The unbalanced abnormality in the child would be detected by microarray, but the balanced precursor in the parent would not. For this reason, cytogenetic investigation of infertility and recurrent miscarriages requires microscopic cytogenetic studies of both partners (MBS item 73289). Approximately 4% of couples with recurrent miscarriages are found to have a balanced translocation in one or both partners.

For similar reasons, microarray testing is not recommended for follow-up studies of CVS or amniotic fluid after a high risk result from NIPT. A microarray would identify the trisomy, but may not detect the rare instance of trisomy due to a familial translocation. Prenatal testing for autosomal trisomy requires microscopic cytogenetic studies (MBS item 73287).
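For quick reference, the MBS item guidance given in the sections above can be collapsed into a simple lookup. This is only a restatement of the items as quoted in this article, not an authoritative MBS reference; check the current schedule before requesting.

```python
# Summary of the MBS item guidance quoted above (illustrative only).
MBS_GUIDANCE = {
    "developmental delay / intellectual disability / autism / two or more congenital abnormalities":
        ("microarray", "73292"),
    "fetal malformations, miscarriage or stillbirth (products of conception)":
        ("microarray", "73287"),
    "infertility or recurrent miscarriage (microscopic studies of both partners)":
        ("microscopic cytogenetics", "73289"),
    "follow-up of a high risk NIPT result (CVS or amniotic fluid)":
        ("microscopic cytogenetics", "73287"),
}

for indication, (test, item) in MBS_GUIDANCE.items():
    print(f"{indication}: request {test} (MBS item {item})")
```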

The future of microarrays

Rapid developments in DNA sequencing have raised the possibility that microarrays will themselves be displaced as the preferred method of cytogenetic analysis (4). It is already possible to replicate many of the functions of a microarray by advanced sequencing methods. However, the microarray currently has the advantages of precision, reproducibility, and affordability that will ensure its continuing use for at least the next few years. And, as already demonstrated above, there may still be clinical questions that require the older methods. Cytogenetics is changing, but it is not dead.

Sonic Genetics offers cytogenetic studies by both microscopic and microarray methods.

General Practice Pathology is a new fortnightly column each authored by an Australian expert pathologist on a topic of particular relevance and interest to practising GPs. The authors provide this editorial, free of charge as part of an educational initiative developed and coordinated by Sonic Pathology.

References
  1. Miller DT, Adam MP, Aradhya S, Biesecker LG, Brothman AR, Carter NP, et al. Consensus statement: chromosomal microarray is a first-tier clinical diagnostic test for individuals with developmental disabilities or congenital anomalies. Am J Hum Genet. 2010 May 14;86(5):749–64.
  2. Dugoff L, Norton ME, Kuller JA. The use of chromosomal microarray for prenatal diagnosis. Am J Obstet Gynecol. 2016;215(4):B2–9.
  3. van den Berg MMJ, van Maarle MC, van Wely M, Goddijn M. Genetics of early miscarriage. Biochim Biophys Acta - Mol Basis Dis. 2012;1822(12):1951–9.
  4. Downie L, Donoghue S, Stutterd C. Advances in genomic testing. Aust Fam Physician. 2017;46(4):200–4.
Dr Linda Calabresi

It is well-known that when a patient with depression is commenced on antidepressants and they are effective, they should continue them for at least a year to lower their risk of relapse. The guidelines are pretty consistent on that point. But what about anxiety disorders?

Along with cognitive behavioural therapy, antidepressants are considered a first-line option for treating anxiety conditions such as generalised anxiety disorder, obsessive-compulsive disorder and post-traumatic stress disorder. Antidepressants have been shown to be generally effective and well-tolerated in treating these illnesses. But how long should they be used in order to improve long-term prognosis? Internationally, guidelines vary in their recommendations. If the treatment is effective, the advice has been to continue treatment for variable durations (six to 24 months) and then taper the antidepressant, but this has been based on scant evidence.

To clarify this recommendation, Dutch researchers conducted a meta-analysis of 28 relapse prevention trials in patients with remitted anxiety disorders. Their findings, recently published in the BMJ, support the continuation of pharmacotherapy. “We have shown a clear benefit of continuing treatment compared with discontinuation for both relapse… and time to relapse”, the authors stated. In addition, the researchers found the relapse risk was not significantly influenced by the type of anxiety disorder, whether the antidepressant was tapered or stopped abruptly, or whether the patient was receiving concurrent psychotherapy.

However, because of the duration of the studies included in the meta-analysis, only the advice to continue antidepressants for at least a year could be supported by evidence. After this, the researchers said there was no evidence-based advice that could be given. “[However] the lack of evidence after this period should not be interpreted as explicit advice to discontinue antidepressants after one year,” they said.

The researchers suggested that those guidelines that advise antidepressants should be tapered after the patient has achieved a sustained remission should be revised. In fact, they said, there were both advantages and disadvantages to continuing treatment beyond a year, and more research was needed to help clinicians assess an individual’s risk of relapse. This is especially important as anxiety disorders are generally chronic, and there have been indications that, in some patients, antidepressant therapy is less effective when reinstated after a relapse. “When deciding to continue or discontinue antidepressants in individual patients, the relapse risk should be considered in relation to side effects and the patient’s preferences,” they concluded.

Ref: BMJ 2017;358:j3927 doi:10.1136/bmj.j3927

Dr Joyce Wu

Non-fasting specimens are now acceptable

Fasting specimens have traditionally been used for the formal assessment of lipid status (total, LDL and HDL cholesterol and triglycerides). In 2016, the European Atherosclerosis Society and the European Federation of Clinical Chemistry and Laboratory Medicine released a joint consensus statement that recommends the routine use of non-fasting specimens for the assessment of lipid status.2 Large population-based studies were reviewed which showed that, for most subjects, the changes in plasma lipid and lipoprotein values following food intake were not clinically significant. Maximal mean changes at 1–6 hours after habitual meals were found to be: +0.3 mmol/L for triglycerides; -0.2 mmol/L for total cholesterol; -0.2 mmol/L for LDL cholesterol; -0.2 mmol/L for calculated non-HDL cholesterol; and no change for HDL cholesterol.

Additionally, studies have found similar or sometimes superior cardiovascular disease risk associations for non-fasting compared with fasting lipid test results. There have also been large clinical trials of statin therapy that monitored the efficacy of treatment using non-fasting lipid measurements. Overall, the evidence suggests that non-fasting specimens are highly effective in assessing cardiovascular disease risk and treatment responses.

Non-HDL cholesterol as a risk predictor

In the 2016 European joint consensus statement2 and in previously published guidelines and recommendations, the clinical utility of non-HDL cholesterol (calculated from total cholesterol minus HDL cholesterol) has been noted as a predictor of cardiovascular disease risk. Moreover, this marker has been found to be more predictive of cardiovascular risk when determined in a non-fasting specimen.
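Because non-HDL cholesterol is a simple derived quantity, it can be calculated from any standard lipid panel, fasting or not. A minimal sketch using the formula above, with invented example values:

```python
def non_hdl_cholesterol(total_chol_mmol_l: float, hdl_mmol_l: float) -> float:
    """Non-HDL cholesterol = total cholesterol - HDL cholesterol (mmol/L)."""
    return total_chol_mmol_l - hdl_mmol_l

# Hypothetical non-fasting results, in mmol/L:
total_chol, hdl = 5.6, 1.2
print(f"non-HDL cholesterol: {non_hdl_cholesterol(total_chol, hdl):.1f} mmol/L")
# -> non-HDL cholesterol: 4.4 mmol/L
```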

What this means for your patients

The assessment of lipid status with a non-fasting specimen has the following benefits:
  • No patient preparation is required, thereby reducing non-compliance
  • Greater convenience with attendance for specimen collection at any time
  • Reports are available for earlier review, avoiding the potential delays associated with obtaining fasting results

Indications for repeat testing or a fasting specimen collection

For some patients, lipid testing on more than one occasion may be necessary in order to establish their baseline lipid status. It is also important to note that an assessment of lipid status carried out in the presence of any intercurrent illness may not be valid. Conditions for which a fasting specimen collection is recommended2 include:
  • Non-fasting triglyceride >5.0 mmol/L
  • Known hypertriglyceridaemia followed in a lipid clinic
  • Recovering from hypertriglyceridaemic pancreatitis
  • Starting medications that may cause severe hypertriglyceridaemia (e.g. steroid, oestrogen, retinoic acid therapy)
  • Additional laboratory tests are requested that require fasting or morning specimens (e.g., fasting glucose, therapeutic drug monitoring)

Lipid reference limits and target levels for treatment are under review

The chemical pathology community in Australia is currently reviewing all relevant publications in order to implement a consensus approach to reporting and interpreting lipid results. This includes the guidelines for management of absolute cardiovascular disease risk developed by the National Vascular Disease Prevention Alliance (NVDPA).3

Further information

  • Absolute cardiovascular disease risk calculator is available at www.cvdcheck.org.au
  • If familial hypercholesterolaemia is suspected, e.g. LDL cholesterol persistently above 5.0 mmol/L in adults, then advice about diagnosis and management is available at www.athero.org.au/fh
References
  1. Rifai N, et al. Non-fasting Sample for the Determination of Routine Lipid Profile: Is It an Idea Whose Time Has Come? Clin Chem 2016;62:428-35.
  2. Nordestgaard BG, et al. Fasting Is Not Routinely Required for Determination of a Lipid Profile: Clinical and Laboratory Implications Including Flagging at Desirable Concentration Cutpoints - A Joint Consensus Statement from the European Atherosclerosis Society and European Federation of Clinical Chemistry and Laboratory Medicine. Clin Chem 2016;62:930-46.
  3. National Vascular Disease Prevention Alliance. Absolute cardiovascular disease management: quick reference guide for health professionals.

General Practice Pathology is a new fortnightly column each authored by an Australian expert pathologist on a topic of particular relevance and interest to practising GPs. The authors provide this editorial, free of charge as part of an educational initiative developed and coordinated by Sonic Pathology.
Dr Linda Calabresi

New guidelines suggest excising a changing skin lesion after one month

As with facing an exam you haven’t studied for, or finding yourself naked in a public place, missing a melanoma diagnosis is the stuff of nightmares for most GPs. In a condition where the prognosis can vary dramatically according to a fraction of a millimetre, the importance of early detection is well-known and keenly felt by clinicians.

According to new guidelines published in the MJA, Australian doctors’ ability to detect classical melanomas early has been improving, as evidenced by both the average thickness of the tumour when it is excised and the improved mortality rates associated with these types of tumours. Unfortunately, however, atypical melanomas are still proving a challenge. Whether they be nodular, occurring in an unusual site or lacking the classic pigmentation, atypical melanomas are still not being excised until they are significantly more advanced, and consequently the prognosis associated with these lesions remains poor.

As a result, a Cancer Council working group has revised the clinical guidelines on melanoma, in particular focusing on atypical presentations. The upshot of their advice? If a patient presents with any skin lesion that has been changing or growing over the course of a month, that lesion should be excised. The Australian guideline authors suggest that in addition to assessing lesions according to the ABCD criteria (asymmetry, border irregularity, colour variegation, and diameter >6 mm) we should add EFG (elevated, firm and growing) as independent indicators of possible melanoma. “Any lesion that is elevated, firm and growing over a period of more than one month should be excised or referred for prompt expert opinion,” they wrote.

In their article, the working group do acknowledge that it is not always a delayed diagnosis that is to blame for atypical melanomas being commonly more advanced when excised. Some of these tumours, such as the nodular and desmoplastic subtypes, can grow very rapidly. “These subtypes are more common on chronically sun-damaged skin, typically on the head and neck and predominantly in older men,” the authors said. However, the most important common denominator with melanomas is that they are changing, they concluded. A history of change, preferably with some documentation of that change such as photographic evidence, should be enough to raise the treating doctor’s index of suspicion. “Suspicious raised lesions should be excised rather than monitored,” they concluded.

Ref: MJA Online 9.10.17 doi:10.5694/mja17.00123

Prof Louise Newman

The Australian newspaper recently reported the royal commission investigating institutional child sex abuse was advocating psychologists use “potentially dangerous” therapy techniques to recover repressed memories in clients with a history of trauma. The reports suggest researchers and doctors are speaking out against such practices, which risk implanting false memories in the minds of victims.

The debate about the nature of early trauma memories and their recovery isn’t new. Since Sigmund Freud developed the idea of “repression” – where people store away memories of stressful childhood events so they don’t interfere with daily life – psychologists and law practitioners have been arguing about the nature of memory and whether it’s possible to create false memories of past situations.

Recovery from trauma for some people involves recalling and understanding past events. But repressed memories, where the victim remembers nothing of the abuse, are relatively uncommon, and there is little reliable evidence about their frequency in trauma survivors. According to reports from clinical practice and experimental studies of recall, most patients can partially recall events, even if elements of these have been suppressed.

What are repressed memories?

Freud introduced the concept that child abuse is a major cause of mental disorders such as hysteria, also known as conversion disorder. People with these disorders could lose bodily functions, such as the ability to move one of their limbs, following a stressful event. The concept of repressing traumatic memories was part of this model. Repression, as Freud saw it, is a fundamental defensive process where the mind forgets, or places elsewhere, events, thoughts and memories we cannot acknowledge or bear.

Freud also suggested that if these memories weren’t recalled, it could result in physical or mental symptoms. He argued symptoms of a mental disorder can be a return of the repressed memories, or a symbolic way of communicating a traumatic event. An example would be suddenly losing the ability to speak when someone has a terrible memory of trauma they feel unable to disclose. This idea of hidden traumas and their ability to influence psychological functioning, despite not being recalled or available to consciousness, has shaped much of our current thinking about symptoms and the need to understand what lies behind them.

Those who accept the repression interpretation argue children may repress memories of early abuse for many years and that these can be recalled when it’s safe to do so. This is variously referred to as traumatic amnesia or dissociative amnesia. Proponents accept repressed traumatic memories can be accurate and used in therapy to recover memories and build up an account of early experiences.

False memory and the memory wars

Freud later withdrew his initial ideas around abuse underlying mental health disorders. He instead drew on his belief in the child’s commonly held sexual fantasies about their parents, which he said could influence the formation of memories that did not mean actual sexual behaviour had taken place. This may have been Freud caving in to the social pressures of his time.

This interpretation lent itself to the false memory hypothesis. Here the argument is that memory can be distorted, sometimes even by therapists. This can influence the experience of recalling memories, resulting in false memories. Those who hold this view oppose therapy approaches based on uncovering memories and believe it’s better to focus on recovery from current symptoms related to trauma. This group point out that emotionally traumatic memories can be more vividly remembered than non-traumatic memories, so it wouldn’t follow that these events would be repressed. They remain sceptical about reclaimed memories and even more so about therapies based on recall – such as recovered memory therapy and hypnosis.

The 1990s saw the height of these memory wars, as they came to be known, between proponents of repressed memory and those of the false memory hypothesis. The debate was influenced by increasing awareness and research on memory systems in academic psychology, and an attitude of scepticism about therapeutic approaches focused on encouraging recall of past trauma. In 1992, the parents of Jennifer Freyd, who had accused her father of sexual assault, founded the False Memory Syndrome Foundation. The parents maintained Jennifer’s accusations were false and encouraged by recovered memory therapy. While the foundation has claimed false memories of abuse are easily created by therapies of dubious validity, there is no good evidence of a “false memory syndrome” that can be reliably defined, or any evidence of how widespread the use of these types of therapies might be.

An unhelpful debate

Both sides do agree that abuse and trauma during critical developmental periods are related to both biological and psychological vulnerability. Early trauma creates physical changes in the brain that predispose the individual to mental disorders in later life. Early trauma has a negative impact on self-esteem and the ability to form trusting relationships. The consequences can be lifelong.

A therapist’s role is to help abuse survivors deal with these long-term consequences and gain better control of their emotional life and interpersonal functioning. Some survivors will want relief from ongoing symptoms of anxiety, memories of abuse and experiences such as nightmares. Others may express the need for a greater understanding of their experiences and to be free from feelings of self-blame and guilt they may have carried from childhood. Some individuals will benefit from longer psychotherapies dealing with the impact of child abuse on their lives. Most therapists use techniques such as trauma-focused cognitive behavioural therapy, which aren’t aimed exclusively at recovering memories of abuse.

The royal commission has heard evidence of the serious impact of being dismissed or not believed when making disclosures of abuse and seeking protection. The therapist should be respectful and guided by the needs of the survivor.
Right now, we need to acknowledge child abuse on a large scale and develop approaches for intervention. It may be time to move beyond these memory wars and focus on the impacts of abuse on victims; impacts greater than the direct symptoms of trauma. It’s vital psychotherapy acknowledges the variation in responses to trauma and the profound impact of betrayal in abusive families. Repetition of invalidation and denial should be avoided in academic debate and clinical approaches.

Louise Newman, Director of the Centre for Women’s Mental Health at the Royal Women’s Hospital and Professor of Psychiatry, University of Melbourne

This article was originally published on The Conversation. Read the original article.
Dr Linda Calabresi

Looks like there is yet another reason to rethink the long-term use of proton pump inhibitors. And this one is a doozy. According to a new study, recently published in the BMJ journal Gut, the long-term use of PPIs is linked to a more than doubling of the risk of developing stomach cancer. And before you jump to the reasonable conclusion that these patients might have had untreated Helicobacter pylori, this 2.4-fold increase in gastric cancer risk occurred in patients who had had H. pylori but had been successfully treated more than 12 months previously. What’s more, the risk increased proportionally with the duration of PPI use and the dose, which the Hong Kong authors said suggested a cause-effect relationship. No such increased risk was found among those patients who took H2 receptor antagonists.

While the study was observational, the large sample size (more than 63,000 patients with a history of effective H. pylori treatment) and the relatively long duration of follow-up (median 7.6 years) lent validity to the findings.

The link between H. pylori and gastric cancer has been known for decades. It has been shown that eradicating H. pylori reduces the risk of developing gastric cancer by 33-47%. However, the study authors said, it is also known that a considerable proportion of these individuals go on to develop gastric cancer even after they have successfully eradicated the bacteria. “To our knowledge, this is the first study to demonstrate that long-term PPI use, even after H. pylori eradication therapy, is still associated with an increased risk of gastric cancer,” they said.

By way of explanation, the researchers note that gastric atrophy is considered a precursor to gastric cancer. And while gastric atrophy is a known sequela of chronic H. pylori infection, it could also be worsened and maintained by the profound acid suppression associated with PPI use, and this could be why the risk persisted even after the infection had been treated.

Bottom line? According to the study authors, doctors need to ‘exercise caution when prescribing long-term PPIs to these patients even after successful eradication of H. pylori.’

Ref: Gut 2017;0:1-8. doi:10.1136/gutjnl-2017-314605

Dr Daman Langguth

Research in rheumatoid arthritis (RA) over the past 10 years has gained significant ground in both pathophysiological and clinical understanding. It is now known that early aggressive therapy within the first three months of the development of joint symptoms decreases the chance of developing severe disease, both clinically and radiologically. To enable this early diagnosis, there has been considerable effort made to discover serological markers of disease.

Around 80% of RA patients become rheumatoid factor positive (IgM RF), though this can take many years to occur. In other words, IgM RF (hereafter called RF) has low sensitivity in the early stages of RA. Furthermore, patients with other inflammatory diseases (including Sjögren’s syndrome, chronic viral and bacterial infections) may also be positive for RF, and thus RF has a relatively low specificity for RA. The RF is, therefore, not an ideal test in the early detection and confirmation of RA.

There has been an ongoing search for an auto-antigen in RA over the past 30 years. It has been known that senescent cells display antigens not present on other cells, and that RA patients may make antibodies against them. This was first reported with the anti-perinuclear factor (APF) antibodies directed against senescent buccal mucosal cells in 1964, but this test was challenging to perform and interpret. These cells were later found to contain filament aggregating protein (filaggrin). Subsequently, in 1979, antibodies directed against keratin (anti-keratin antibodies, AKA) in senescent oesophageal cells were discovered. In 1994, another antibody, named anti-Sa, was discovered that reacted against modified vimentin in mesenchymal cells. In the late 1990s, antibodies directed against citrullinated peptides were ‘discovered’. In fact, we now know that all of the aforementioned antibodies detect similar antigens.

When cells grow old, some of the structural proteins undergo citrullination under the direction of cellular enzymes: arginine residues are converted to the non-standard amino acid citrulline. Citrullinated peptides fit better into the HLA-DR4 molecules that are strongly associated with RA development, severity and prognosis. It is also known that many types of citrullinated peptides are present in the body, both in and outside joints. It has been determined that sera from individual RA patients contain antibodies that react against different citrullinated peptides, but these individuals’ antibodies do not react against all possible citrullinated peptides. Thus, to improve the sensitivity of the citrullinated peptide assays, cyclic citrullinated peptides (CCP) have been artificially generated to mimic a range of conformational epitopes present in vivo. It is these artificial peptides that are used in the second generation anti-CCP assays. Sullivan Nicolaides Pathology uses the Abbott Architect assay, which is standardised against the Axis-Shield (Dundee, UK) second generation CCP assay.

False positive CCP antibodies have recently been reported to occur in acute viral (e.g. EBV, HIV) and some atypical bacterial (Q fever) seroconversions. The antibodies may be present for a few months after seroconversion, but do not predict inflammatory arthritis in these individuals.

Anti-CCP assays

CCP antibodies alone give a sensitivity of around 66% in early RA, similar to RF, though they have a much higher specificity of >95% (compared with around 80% for RF). The combination of anti-CCP and RF tests is now considered to be the ‘gold standard’ in the early detection of RA. Combining RF with anti-CCP enables approximately 80% (i.e. 80% sensitivity) of RA patients to be detected in the early phase (less than six months’ duration) of this disease.

The presence of anti-CCP antibodies has also been shown to predict RA patients who will go on to develop more severe joint disease, both radiologically and clinically. They also appear to be a better marker of disease severity than RF. Anti-CCP antibodies have also been shown to be present prior to the development of clinical disease, and thus may predict the development of RA in patients with uncharacterised recent onset inflammatory arthritis. At present, it is not known whether monitoring the level of these antibodies will be useful as a marker of disease control, though some data in patients treated with biologic agents (e.g. etanercept, infliximab) suggest they may be useful. It has not been determined whether the absolute levels of CCP antibodies allow further disease risk stratification.

Our pathology laboratories report CCP antibodies in a quantitative fashion: normal is less than 5 U/mL, with a range of up to 2,000 U/mL.

References
  1. ACR Position statement on anti-CCP antibodies http://www.rheumatology.org/publications hotline/1003anticcp.asp.
  2. Forslind K, Ahlmen M, Eberhardt K et al. Prediction of radiologic outcome in early rheumatoid arthritis in clinical practice: role of antibodies to citrullinated peptides (anti-CCP). Ann Rheum Dis 2004; 63:1090-5.
  3. Huizinga TWJ, Amos CI, van der Helm-van Mil AHM et al. Refining the complex rheumatoid arthritis phenotype based on specificity of the HLA-DRB1 Shared epitope for antibodies to citrullinated proteins. Arthritis Rheum 2005; 52:3433-8.
  4. Lee DM, Schur PH. Clinical Utility of the anti-CCP assay in patients with rheumatic disease. Ann Rheum Dis 2003; 62:870-4.
  5. Van Gaalen FA, Linn-Rasker SP, van Venrooij WJ, et al. Autoantibodies to cyclic citrullinated peptides predict progression to rheumatoid arthritis in patients with undifferentiated arthritis. Arthritis Rheum 2004;50:709-15.
  6. Zendman AJW, van Venrooij WJ, Prujin GJM. Use and significance of anti-CCP autoantibodies in rheumatoid arthritis. Rheumatology 2006; 46:20-5.

General Practice Pathology is a new regular column each authored by an Australian expert pathologist on a topic of particular relevance and interest to practising GPs. The authors provide this editorial, free of charge as part of an educational initiative developed and coordinated by Sonic Pathology.
Dr Linda Calabresi

There has been a lot of noise around opioid use lately, particularly in the States, where it’s been declared a public health emergency. While concerted efforts are being made to ensure that patients who are experiencing chronic pain are not also put in a position where they have to deal with opioid addiction, in cases of severe, acute pain most doctors would consider pain relief the priority and opioids the gold standard. Well, it seems that too may need a rethink.

According to a new randomised controlled trial just published in JAMA, an oral ibuprofen/paracetamol combination works just as well at reducing pain, such as that felt with a suspected fractured arm, as a range of oral opioid combinations including oxycodone and paracetamol. The US researchers randomised over 400 patients who presented to emergency with moderate to severe arm or leg pain, severe enough to warrant investigation by imaging, to receive either an oral paracetamol/ibuprofen combination or one of three opioid combination analgesics: oxycodone/paracetamol, hydrocodone/paracetamol or codeine/paracetamol. Two hours after ingestion there was no statistically significant or clinically important difference in pain reduction between the four groups.

A limitation of the study was that it didn’t compare adverse effects; however, the study authors said their findings support the use of the paracetamol/ibuprofen combination as an alternative to oral opioid analgesics, at least in cases of severe arm or leg pain. Their findings also contradict the long-held idea that non-opioid painkillers are less effective than opioids, an idea that has been underpinned by the WHO pain ladder that has guided clinicians managing both cancer and non-cancer pain since 1986.

Even though most scripts for opioids are written in the community, previous research has shown that long-term opiate use is higher among those patients who were initially treated in hospital. “Typically, treatment regimens that provide adequate pain reduction in the ED setting are used for pain management at home,” an accompanying editorial stated. “[This trial] provides important evidence that nonopioid analgesia can provide similar pain reduction as opioid analgesia for selected patients in the ED setting.” What’s more, the effectiveness of this paracetamol and ibuprofen combination for moderate to severe pain may also translate to its more widespread use for acute pain in other clinical conditions traditionally treated with opioid medication; however, this would need further investigation, the editorial author concluded.

Ref: JAMA 2017; 318(17): 1661-1667. doi:10.1001/jama.2017.16190; JAMA 2017; 318(17): 1655-1656

Dr Linda Calabresi

New US guidelines are the most aggressive yet in terms of targets for blood pressure control. Put out by the American College of Cardiology and the American Heart Association, and published in JAMA, the guidelines recommend we now consider anyone with a BP of 120/80 mmHg or above as having abnormal blood pressure. People who have a systolic between 120 and 130 mmHg but whose diastolic is still below 80 mmHg are to be considered to have elevated BP. And those with a systolic between 130 and 140 mmHg or a diastolic between 80 and 90 mmHg should now be classified as having stage 1 hypertension. An accompanying editorial estimates that this reclassification will result in a 14% increase in the US population who should be recognised as having hypertension.

But before clinicians start reaching for the script pad, the guidelines recommend this stage 1 hypertension be initially treated with non-pharmacological therapies – basically addressing the factors that most likely pushed their blood pressure up to start with: lose weight, exercise more, reduce salt intake, cut down on alcohol. The exception is the group of patients whose absolute 10-year CVD risk is 10% or more. In these cases, it’s gloves off.

The less than 130/80 target for high-risk patients is very similar to Australian guidelines. What’s different is that this is now a recommended target for everyone. The new US guidelines recommend everyone with a BP over 140/90 mmHg be treated with medication (preferably two agents) regardless of their absolute CV risk. Our Heart Foundation says to try other lifestyle changes in people with a very low CV risk and no other comorbidities until the 160/100 mmHg mark is reached.

The other new development in the US guidelines is the recommendation to use BP measurements from ambulatory or home BP monitoring to both confirm a diagnosis of hypertension and titrate therapy. This is in keeping with Australian recommended practice. The US guidelines were developed by an expert committee after examining all the current evidence and conducting a series of systematic reviews looking at some key clinical questions.

“From a public health perspective, considering the high population-attributable risk of CVD associated with hypertension, the potential benefits of tighter control of hypertension are substantial,” the guideline authors wrote. However, they do acknowledge that such an aggressive approach carries risks, especially in the elderly. “Although studies do suggest that lower BP is better for most patients, including those older than 75 years, the balance of the potential benefits of hypertension management and medication costs, adverse effects, and polypharmacy must be considered for each individual patient,” they said.

Ref: JAMA. Published online November 20, 2017. doi:10.1001/jama.2017.18706
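To make the new cut-offs concrete, here is a small sketch of the classification described in the article above (thresholds as stated; an illustration only, not clinical software):

```python
def classify_bp(systolic: float, diastolic: float) -> str:
    """Classify a BP reading (mmHg) per the 2017 ACC/AHA categories
    described above; the higher of the two component categories wins."""
    if systolic >= 140 or diastolic >= 90:
        return "stage 2 hypertension (medication recommended)"
    if systolic >= 130 or diastolic >= 80:
        return "stage 1 hypertension (lifestyle first, unless 10-year CVD risk >= 10%)"
    if systolic >= 120:
        return "elevated blood pressure"
    return "normal"

for reading in [(118, 76), (124, 78), (134, 82), (146, 94)]:
    print(reading, "->", classify_bp(*reading))
```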

Dr Jenny Robson

Schistosomiasis, also known as bilharzia, is the second most prevalent tropical disease after malaria and is a leading cause of morbidity in many parts of the world. It is not uncommon in Australia because of the many travellers who visit endemic areas and swim or bathe in freshwater lakes and streams. Places commonly implicated include Lake Kariba and Lake Malawi in Africa. Immigrants and refugees from bilharzia-endemic countries are also likely to present with untreated infection. With increasing travel to, and migration from, Africa and the Americas, knowledge of the dangers and means of avoiding schistosomiasis is essential.

Schistosomiasis is caused by trematodes of the genus Schistosoma. The principal schistosomes of medical importance, S japonicum, S mansoni, S mekongi (intestinal schistosomiasis) and S haematobium (urinary schistosomiasis), infect people who enter water in which infected snails (intermediate hosts) are living. The larval cercariae shed by the snail actively penetrate unbroken skin and develop into schistosomulae that migrate through the lungs to the liver, where they mature into adults. Female worms lay eggs that pass through the vessels and tissues to the lumen of the gut or bladder (depending on localisation of the worms). A proportion of eggs escape from the host and may be found in faeces or urine. The host’s immune response to eggs that become lodged in the tissues is largely responsible for disease.

Geographic distribution

This is governed by the distribution of the intermediate host snail.
  • S haematobium: Africa, Middle East, India (only Maharashtra)
  • S japonicum: Philippines, Indonesia (only Sulawesi), parts of China
  • S mansoni: Africa, Middle East, some Caribbean Islands, parts of South America (Brazil, Surinam, Venezuela)
  • S mekongi: Laos and Cambodia
  • S intercalatum: 10 countries within the rainforest belt of West Africa

At-risk groups

Owing to the absence of suitable snail hosts, transmission cannot occur in Australia. A history of overseas travel or residence is essential for this diagnosis. Chronic schistosomiasis is more likely to be seen in migrants and refugees from endemic areas. In Australia, where the definitive hosts are freshwater and marine birds, non-human trematodes may cause schistosomal dermatitis (cercarial dermatitis, swimmer's itch). Onset is usually within 15 minutes of skin contact with cercariae.

Clinical presentation

Disease due to schistosomiasis depends on the infecting species and the intensity of infection. Acute schistosomiasis occurs two to 12 weeks post infection, and symptoms last for periods varying from one day to a month or more; recurrence of symptoms 2-3 weeks later is common. Between 40% and 95% of individuals not previously exposed to infection develop symptoms, which include fever, malaise, headache, abdominal pain, diarrhoea and urticaria. Many have eosinophilia. After the initial acute onset, most become asymptomatic, although those with S haematobium infections may develop microscopic or macroscopic haematuria. Rare complications result from ectopic deposition of eggs in the spinal cord and brain.

Most travellers are only mildly infected and are therefore often asymptomatic and unlikely to develop the severe manifestations of chronic schistosomiasis. Severe disease occurs in patients with heavy and prolonged infection. Hepatosplenomegaly, portal hypertension, ascites and oesophageal varices may result from intestinal schistosomiasis, and frank haematuria with varying degrees of impairment of the urinary bladder and ureters may occur with S haematobium infections.

Diagnosis

The prepatent period of S japonicum, S mansoni and S mekongi is 6-8 weeks, and for S haematobium 10-12 weeks. Examination of faeces or urine before this time often yields false negative results. Similarly, with serology, testing too early may result in false negative results. Antibody development occurs slightly before eggs are detected. Eosinophilia (greater than 0.60 × 10⁹/L) is present in up to 80% of patients with infections; however, its absence does not exclude infection.
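Since testing before the prepatent period has elapsed risks false negatives, the earliest sensible testing date can be worked out from the date of freshwater exposure. A small sketch using the upper end of the prepatent periods quoted above (the exposure date is hypothetical):

```python
from datetime import date, timedelta

# Upper end of the prepatent periods quoted above, in weeks.
PREPATENT_WEEKS = {
    "S japonicum": 8,
    "S mansoni": 8,
    "S mekongi": 8,
    "S haematobium": 12,
}

def earliest_test_date(exposure: date, species: str) -> date:
    """Earliest date at which faecal/urine examination (and serology)
    is unlikely to be falsely negative, per the figures quoted above."""
    return exposure + timedelta(weeks=PREPATENT_WEEKS[species])

exposure = date(2017, 1, 15)  # hypothetical freshwater exposure
for species in PREPATENT_WEEKS:
    print(f"{species}: test from {earliest_test_date(exposure, species)}")
```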

Parasitologic examination

Diagnosis is by demonstration of eggs of S japonicum, S mansoni and S mekongi in faeces, or eggs of S haematobium in urine. At least two stool or urine specimens should be submitted for examination over a period of 10 days. Whilst eggs may be found in all specimens of urine, there is some evidence of a diurnal periodicity, with a peak of excretion around midday. Collection of the terminal portion of urine collected between noon and 2 pm is therefore recommended. Schistosome eggs can also be demonstrated in rectal snips or bladder biopsies. Viability of eggs can be assessed if the biopsies are received fresh.

Serologic examination

At our laboratory, antibodies are detected by enzyme immunoassay (EIA) using purified S mansoni egg antigen. Antibodies to this antigen may be undetectable in the prepatent period, which lasts 8-10 weeks. The test detects genus-specific antibodies. In the absence of a diagnosis based on egg identification, travel history provides the best assessment of likely species.

Interpretation

Parasitologic: Faeces is concentrated (modified formalin-ethyl acetate) and urine either centrifuged or filtered; all of the concentrate or sediment is examined. Because of the low sensitivity of these techniques, a negative faecal or urine examination does not exclude schistosomiasis. Microscopic examination of eggs enables the species of parasite to be determined. At least two examinations on different days are recommended.

Serologic: Schistosome serology cannot distinguish between past and current infection, nor differentiate the species of infection. Clinical history and further investigations should be considered when establishing the diagnosis. Recent infections may be serologically negative.

Preventative measures

Cercariae can burrow through the mucosa of the mouth as well as through unbroken skin. All fresh water in endemic areas should be considered suspect, although snails tend to live in slow-flowing and stagnant waters, rather than in rapids and fast-flowing waters. If freshwater contact is unavoidable, bathing water should be heated to 50°C for five minutes or treated with iodine or chlorine as for the treatment of drinking water. Water can also be strained through paper filters, or allowed to stand for 2-3 days before use. This exceeds the usual life span of the cercariae. Of course, the container must be kept free of snails. High waterproof boots or hip waders are recommended if wading through streams or swamps. It is wise to carry a pair of rubber gloves to protect hands when contact with fresh water is anticipated. Vigorous towel drying, and rubbing alcohol on exposed skin immediately after contact with untreated water, may also help reduce cercarial penetration. Vegetables should be well cooked and salads avoided as these may have been washed in infected water, allowing cercariae to attach themselves to the leaves.

Treatment

Praziquantel (Biltricide) 20 mg/kg bodyweight every four hours for 2-3 doses, depending upon the species, is recommended. In travellers, this is likely to achieve cure rates in the order of 90%. Tablets are scored and available as a 600 mg dose, dispensed six per pack. In patients at risk of chronic disease, such as refugees and migrants, it is important to be aware of complications that may arise from chronic infection: liver fibrosis, portal hypertension and its sequelae, and colorectal malignancy in the intestinal forms; obstructive uropathy, superimposed bacterial infection, infertility and possibly bladder cancer in the urinary form.
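The dose arithmetic is straightforward; a worked sketch for a hypothetical 70 kg patient follows. The rounding to the nearest half tablet is an assumption based only on the tablets being scored, as noted above.

```python
# Praziquantel dose arithmetic as quoted above: 20 mg/kg per dose,
# 2-3 doses, supplied as 600 mg scored tablets.

weight_kg = 70                                # hypothetical patient weight
dose_mg = 20 * weight_kg                      # 1400 mg per dose
tablets_exact = dose_mg / 600                 # ~2.33 tablets
half_tablets = round(tablets_exact * 2) / 2   # scored tablets: nearest half

print(f"per dose: {dose_mg} mg, i.e. about {half_tablets} tablets (x 2-3 doses)")
```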

Follow-up

Follow-up schistosomiasis serology is recommended 12 to 36 months after treatment. Follow-up serology may differ between immigrants and returned travellers. Travellers may show a more rapid serological decline post-treatment due to a shorter duration of infection and lower parasite burden. Immigrants may even show a rise in titre within the first 6-12 months post-treatment. Persisting titres should not automatically prompt retreatment; the decision should be based on symptoms, parasite identification or eosinophilia. Viable eggs may continue to be excreted for up to one month after successful treatment. Non-viable and degenerate eggs can be found in tissue biopsies for years after infection has occurred.
General Practice Pathology is a new regular column each authored by an Australian expert pathologist on a topic of particular relevance and interest to practising GPs. The authors provide this editorial, free of charge as part of an educational initiative developed and coordinated by Sonic Pathology.
Dr Linda Calabresi

All pregnant women who smoke should be offered nicotine replacement therapy (NRT) as an option to help them quit, Australian researchers say. In a review published in the MJA, the authors said that even though there is a general acknowledgement that no firm evidence proves NRT is safe or effective in pregnancy, all the current guidelines recommend its use for women who cannot quit without medication. In a nutshell, NRT is safer than smoking, and smoking during pregnancy is the most important preventable risk factor for poor maternal and infant health outcomes, they said.

Despite this, there appears to be a reluctance among doctors, both here and around the world, to prescribe the therapy to pregnant women. The researchers cited a recent survey of Australian GPs and obstetricians that found one in four said they never prescribed NRT in pregnancy. One possible reason for this reluctance, they suggest, is the caveats and cautions included in many of the guidelines. Phrases such as ‘only if women are motivated’, ‘only give out two weeks’ supply’ and ‘under close supervision’ hardly inspire confidence in the safety of the therapy. “Ambiguous messages may be contributing to the low NRT prescribing rates and, therefore, it is important to provide a clear practical message to health practitioners and women,” they said.

After analysing the various guidelines, the researchers suggest using the strength of the urge to smoke, as well as how frequently the urge occurs, to help assess when a woman needs to start or increase the dose of her NRT. “The most important guidance for NRT in pregnancy is to use the lowest possible dose that is effective. However, to be effective, women should be instructed to use as much as needed to deal with cravings,” they advised. They also recommend women be encouraged to use NRT for at least 12 weeks, or longer if required, to ensure they don’t relapse. All smokers who are pregnant should be told: “There is nothing better for you and your baby’s health than to quit smoking.”

Ref: MJA Online first 4.12.17 doi:10.5694/mja17.00446