This study examined the association of diabetes with the onset of dementia (including Alzheimer's disease (AD), vascular dementia (VD) and any dementia) and mild cognitive impairment (MCI) using a quantitative meta‐analysis of longitudinal studies. EMBASE and MEDLINE were searched for articles published up to December 2010. All studies that examined the relationship between diabetes and the onset of dementia or MCI were included. Pooled relative risks were calculated using fixed and random effects models. Nineteen studies met our inclusion criteria, comprising 6184 subjects with diabetes and 38 530 subjects without diabetes; all subjects were free of dementia and MCI at baseline. The quantitative meta‐analysis showed that subjects with diabetes had a higher risk of AD (relative risk (RR): 1.46, 95% confidence interval (CI): 1.20–1.77), VD (RR: 2.48, 95% CI: 2.08–2.96), any dementia (RR: 1.51, 95% CI: 1.31–1.74) and MCI (RR: 1.21, 95% CI: 1.02–1.45) than those without. These findings indicate that diabetes is a risk factor for incident dementia (including AD, VD and any dementia) and MCI.
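The fixed-effect pooling used in such meta-analyses combines each study's log relative risk, weighted by inverse variance. The sketch below is an illustrative reconstruction, not the authors' actual code; the study inputs are hypothetical, with each study's standard error recovered from the width of its reported 95% CI.

```python
import math

def pooled_rr_fixed(studies):
    """Fixed-effect (inverse-variance) pooling of relative risks.

    Each study is a tuple (RR, ci_lower, ci_upper); pooling is done on
    the log scale, weighting each study by 1/SE^2.
    """
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            (math.exp(log_rr - 1.96 * se_pooled),
             math.exp(log_rr + 1.96 * se_pooled)))

# Hypothetical studies: the pooled estimate sits between the two RRs,
# pulled towards the more precise (narrower-CI) study.
rr, ci = pooled_rr_fixed([(1.4, 1.1, 1.8), (1.6, 1.0, 2.5)])
```

A random-effects model would additionally widen each weight by a between-study variance term; the fixed-effect version above shows the core arithmetic.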
New oral anticoagulants (NOAC) are becoming available as alternatives to warfarin to prevent systemic embolism in patients with non‐valvular atrial fibrillation and for the treatment and prevention of venous thromboembolism. An in‐depth understanding of their pharmacology is invaluable for appropriate prescription and for optimal management of patients receiving these drugs should unexpected complications (such as bleeding) occur, or should the patient require urgent surgery. The Australasian Society of Thrombosis and Haemostasis has set out to inform physicians on the use of the different NOAC based on currently available evidence, focusing on: (i) selection of the most suitable patient groups to receive NOAC, (ii) laboratory measurement of NOAC in appropriate circumstances and (iii) management of patients taking NOAC in the perioperative period, including strategies to manage bleeding complications or 'reverse' the anticoagulant effects for urgent invasive procedures.
Background: Measuring serum 25(OH)D concentration is common in clinical practice despite the questionable reliability of assays. Aims: The aim of the present study was to examine agreement in 25(OH)D concentrations measured by different assays and laboratories, and to consider the related clinical implications. Methods: Serum samples from 813 participants in the Australian Multicentre Study of Environment and Immune Function (the Ausimmune Study) were assayed for 25(OH)D concentration. Duplicate samples from subsets of subjects were sent to different laboratories, two using the DiaSorin Liaison assay (Laboratories A and B) and one using liquid chromatography‐tandem mass spectrometry (LC‐MS/MS, selected here as the nominal gold standard). Pairwise within‐assay (both within‐laboratory and between‐laboratories) and between‐assay agreement was examined using Deming regression and Bland‐Altman plots. Common 25(OH)D cut‐points for classification of vitamin D deficiency were used to compare the different assays. Results: 25(OH)D concentrations measured using Liaison were substantially lower at Laboratory A than at Laboratory B (mean bias −11.60 nmol/L, 95% limits of agreement −46.39, 23.18). Both Liaison assays returned much lower 25(OH)D concentrations than LC‐MS/MS (mean bias up to −26.05 nmol/L, 95% limits of agreement −65.31, 13.21). For Laboratory A participants, 46% (355/765) were classified as vitamin D deficient (25(OH)D <50 nmol/L) using Liaison compared with 17% (128/765) using LC‐MS/MS. For Laboratory B participants, the respective figures were 36% (76/209) and 20% (41/209). Hence, between 1‐in‐5 and 1‐in‐3 participants were misclassified as 'deficient'. Conclusion: Bias and variability in 25(OH)D measurements sufficient to significantly affect clinical decision‐making were found both between laboratories and between assays. The adoption of common standards allowing assay calibration is urgently required.
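The Bland-Altman limits of agreement quoted above are simply the mean of the paired differences plus or minus 1.96 standard deviations. A minimal sketch (the differences below are made-up numbers, not Ausimmune data):

```python
import statistics

def bland_altman_limits(differences):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 SD)
    for paired measurement differences (assay A minus assay B)."""
    bias = statistics.mean(differences)
    sd = statistics.stdev(differences)  # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical between-assay differences in nmol/L
bias, (lower, upper) = bland_altman_limits([-10.0, -20.0, -30.0])
```

Note that any correctly computed interval is centred on the mean bias; the reported (−65.31, 13.21) interval is centred on −26.05, which serves as a quick consistency check on Bland-Altman output.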
It is generally accepted that the prevalence of food allergy has been increasing in recent decades, particularly in westernised countries, yet high‐quality evidence based on challenge‐confirmed diagnosis of food allergy to support this assumption is lacking because of the high cost and potential risks associated with conducting food challenges in large populations. Accepting this caveat, the use of surrogate markers for diagnosis of food allergy (such as nationwide data on hospital admissions for food anaphylaxis, or clinical history in combination with allergen‐specific IgE (sIgE) measurement in population‐based cohorts) has provided consistent evidence of increasing prevalence of food allergy, at least in western countries such as the UK, United States and Australia. Recent reports that children of East Asian or African ethnicity raised in a western environment (Australia and the United States respectively) have an increased risk of developing food allergy compared with resident Caucasian children suggest that food allergy might also increase across Asian and African countries as their economies grow and populations adopt a more westernised lifestyle. Given that many cases of food allergy persist, mathematical principles predict a continued increase in food allergy prevalence in the short to medium term, until an effective treatment is identified that allows the rate of disease resolution to equal or exceed the rate of new cases.
Background: The prognostic significance of various systemic inflammation-based markers has been explored in different cancers. These markers can be used to assist with decision-making in oncology clinics. Aim: The aim of this study was to investigate the prognostic significance of three systemic inflammation-based factors: neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR) and modified Glasgow Prognostic Score (mGPS) in patients with advanced pancreatic cancer. Methods: Data were collected retrospectively for advanced pancreatic cancer patients treated between 1 January 2008 and 31 December 2012 at the Royal Perth Hospital. The ratios were dichotomised as <5 versus >=5 for NLR and <200 versus >=200 for PLR. The mGPS was scored as: mGPS '0' = both C-reactive protein (CRP) and albumin normal, mGPS '1' = elevated CRP (>10 mg/L) with normal albumin and mGPS '2' = elevated CRP with low albumin (<35 g/L). Results: Median overall survival was 8.5 months versus 2.6 months for NLR <5 versus >=5 respectively (P = 0.0007; hazard ratio (HR) 1.81), 9.1 months versus 4 months for PLR <200 versus >=200 respectively (P = 0.007; HR 1.64) and 8.3 months, 9.6 months and 1.8 months for mGPS scores 0, 1 and 2 respectively (P = 0.0004). Besides Eastern Cooperative Oncology Group performance status, NLR, PLR and mGPS were significant independent prognostic markers on both univariate and multivariate analyses. Conclusions: Our findings suggest that the NLR, PLR and mGPS derived from routine blood tests can be used as clinically meaningful biomarkers to stratify advanced pancreatic cancer patients into different prognostic groups.
The incidence of Clostridium difficile infection (CDI) continues to rise, while treatment remains problematic because of the recurrent, refractory and potentially severe nature of the disease. The treatment of C. difficile is a challenge for community‐ and hospital‐based clinicians. With the advent of an expanding therapeutic arsenal against C. difficile since the last published Australasian guidelines, an update on CDI treatment recommendations for Australasian clinicians was required. On behalf of the Australasian Society of Infectious Diseases, we present the updated guidelines for the management of CDI in adults and children.
Aims: To investigate the changes in polypharmacy and the Drug Burden Index (DBI) occurring during hospitalisation for older people. The secondary aim was to examine the associations of these two measures with length of hospital stay and admission for falls or delirium. Methods: A retrospective analysis of patients' medical records was undertaken at a large university teaching hospital (Sydney, Australia) for patients aged >= 65 years admitted under the care of the geriatric medicine or rehabilitation teams. Polypharmacy was defined as the use of more than five regular medications. The DBI measures exposure to drugs with anticholinergic and sedative effects. Logistic regression analysis was conducted to investigate the associations of polypharmacy and DBI with the outcome measures. Data are presented as odds ratios with 95% confidence intervals. Results: A total of 329 patients was included in this study. The mean (+/- standard deviation) age of the population was 84.6 +/- 7.0 years, 62% were female and 40% were admitted from residential aged-care facilities. On admission, polypharmacy was observed in 60% of the cohort and DBI exposure in 50%. DBI and polypharmacy exposure decreased during hospitalisation, but only the decrease in the number of medications taken was statistically significant (P = 0.02). Patients with a high DBI (>= 1) were approximately three times more likely to be admitted for delirium than those with no DBI exposure (odds ratio, 2.95; 95% confidence interval, 1.34-6.51). Conclusions: In the present study, DBI was associated with an increased risk of hospital admission for delirium only. Polypharmacy was not associated with any of the clinical measures.
Background: Advance care planning is regarded as integral to better patient outcomes, yet little is known about the prevalence of advance directives (AD) in Australia. Aim: To determine the prevalence of AD in the Australian population. Methods: A national telephone survey about estate and advance planning. The sample was stratified by age (18-45 and >45 years) and quota sampling occurred based on population size in each state and territory. Results: Fourteen per cent of the Australian population has an AD. There is state variation, with people from South Australia and Queensland more likely to have an AD than people from other states. Will making, and particularly completion of a financial enduring power of attorney, are associated with higher rates of AD completion. Standard demographic variables were of limited use in predicting whether a person would have an AD. Conclusions: Despite efforts to improve uptake of advance care planning (including AD), barriers remain. One likely trigger for completing an AD and advance care planning is undertaking a wider future planning process (e.g. making a will or financial enduring power of attorney). This presents opportunities to increase advance care planning, but steps are needed to ensure that planning which occurs outside the health system is sufficiently informed and supported by health information so that it is useful in the clinical setting. Variation by state could also suggest that redesign of regulatory frameworks (such as a user-friendly and well-publicised form backed by statute) may help improve uptake of AD.
Background: Severe asthma is a high-impact disease. Omalizumab targets the allergic inflammatory pathway; however, effectiveness data in a population with significant comorbidities are limited. Aims: To describe severe allergic asthma, omalizumab treatment outcomes and predictors of response among the Australian Xolair Registry participants. Methods: A web-based post-marketing surveillance registry was established to characterise the use, effectiveness and adverse effects of omalizumab (Xolair) for severe allergic asthma. Results: Participants (n = 192; mean age 51 years, 118 female) with severe allergic asthma from 21 clinics in Australia were assessed, and 180 received omalizumab therapy. They had poor asthma control (Asthma Control Questionnaire, ACQ-5, mean score 3.56) and significant quality of life impairment (Asthma-related Quality of Life Questionnaire score 3.57), and 52% were using daily oral corticosteroids (OCS). Overall, 95% had one or more comorbidities (rhinitis 48%, obesity 45%, cardiovascular disease 23%). The omalizumab responder rate, assessed by an improvement of at least 0.5 in ACQ-5, was high at 83%. OCS use was significantly reduced. The response in participants with comorbid obesity and cardiovascular disease was similar to that in those without these conditions. Baseline ACQ-5 >= 2.0 (P = 0.002) and older age (P = 0.05) predicted the magnitude of change in ACQ-5 in response to omalizumab. Drug-related adverse events included anaphylactoid reactions (n = 4), headache (n = 2) and chest pain (n = 1). Conclusion: Australian patients with severe allergic asthma report a high disease burden and have extensive comorbidity. Symptomatic response to omalizumab was high despite significant comorbid disease. Omalizumab is an effective targeted therapy for severe allergic asthma with comorbidity in a real-life setting.
Background: The Barwon area in Australia has one of the highest incidence rates of inflammatory bowel disease (IBD) and therefore is an ideal location to study the impact of environmental exposures on the disease's development. Aim: To study these exposures prior to the development of IBD in a population-based cohort. Methods: One hundred and thirty-two incident cases (81 Crohn disease (CD) and 51 ulcerative colitis (UC)) from an IBD registry and 104 controls completed the International Organization of Inflammatory Bowel Diseases environmental questionnaire. This included 87 questions about pre-illness exposures, covering childhood illnesses, vaccinations, breastfeeding, house amenities, pets and swimming, diet and smoking. Results: The factors associated with CD included smoking (odds ratio (OR): 1.42, confidence interval (CI): 1-2.02, P = 0.029); childhood events, including tonsillectomy (OR: 1.74, CI: 1.15-2.6, P = 0.003) and chicken pox infection (OR: 3.89, CI: 1.61-9.4, P = 0.005); and frequent pre-diagnosis fast food intake (OR: 2.26, CI: 1.76-4.33, P = 0.003). In UC, the risk factors included smoking (OR: 1.39, CI: 1.1-1.92, P = 0.026) and frequent pre-diagnosis fast food intake (OR: 2.91, CI: 1.54-5.58, P < 0.001), while high caffeine intake was protective (OR: 0.51, 95% CI: 0.3-0.87, P = 0.002). Other protective exposures for UC included high fruit intake (OR: 0.59, CI: 0.4-0.88, P = 0.003) and having pets as a child (OR: 0.36, CI: 0.2-0.79, P = 0.001). Conclusion: This first Australian population-based study of environmental risk factors confirms that smoking, childhood immunological events and dietary factors play a role in IBD development, while high caffeine intake and pet ownership appear to be protective.
Background: Prognosis for patients with 'malignant' or space-occupying oedema after middle cerebral artery infarct remains poor despite maximal medical therapy delivered in the intensive care setting. Aim: We performed a meta-analysis to evaluate the value of surgical decompression versus medical management alone in patients suffering from malignant middle cerebral artery infarct. Methods: A systematic search was conducted using MEDLINE, PubMed, EMBASE, Current Contents Connect, Cochrane Library, Google Scholar, Science Direct and Web of Science. Original data were abstracted from each study and used to calculate a pooled odds ratio (OR) and 95% confidence interval (95% CI). Results: The overall OR for mRS 6 (death) at 6 months for decompressive surgery as compared with standard medical management showed a statistically significant reduction, with an OR of 0.19 (95% CI: 0.10-0.37). The frequency of patients with mRS 2, 3 and 5 outcomes was higher in the decompressive surgery cohort; however, these differences did not reach statistical significance. On the other hand, the number of patients with an mRS score of 4 was significantly higher in the decompressive surgery cohort, with an OR of 3.29 (95% CI: 1.76-6.13). The overall OR for mRS 6 (death) at 12 months for decompressive surgery as compared with standard medical management likewise showed a statistically significant reduction, with an OR of 0.17 (95% CI: 0.10-0.29). The frequency of patients with mRS 3 and 5 outcomes was higher in the decompressive surgery cohort; however, these differences did not reach statistical significance. The number of patients with an mRS score of 4 was again significantly higher in the decompressive surgery cohort, with an OR of 4.43 (95% CI: 2.27-8.66). At 12 months it was also observed that the number of patients with an mRS score of 2 was significantly higher in the decompressive surgery cohort, with an OR of 4.51 (95% CI: 1.06-19.24).
Conclusions: Our results imply that surgical intervention decreased mortality of patients with malignant middle cerebral artery infarct at the expense of increasing the proportion suffering substantial disability at the end of follow-up.
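The pooled ORs above are built from 2x2 outcome tables. A minimal sketch of the per-table calculation with a Woolf (log-normal) confidence interval; the counts shown are hypothetical and are not taken from the pooled trials:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = events in the treated group, b = non-events in the treated group,
    c = events in the control group, d = non-events in the control group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, (math.exp(math.log(or_) - 1.96 * se),
                 math.exp(math.log(or_) + 1.96 * se))

# Hypothetical: 10/50 deaths with surgery versus 30/50 with medical management
or_, (lo, hi) = odds_ratio(10, 40, 30, 20)
```

An OR whose CI excludes 1 (as here, where both limits fall below 1) is the criterion behind phrases such as "statistically significant reduction" in the results above; a meta-analysis then pools these log-ORs across studies, again by inverse-variance weighting.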
Background: Performance of linear probe endobronchial ultrasound‐guided transbronchial needle aspiration (EBUS‐TBNA) for staging non‐small‐cell lung cancer has been extensively studied. Alternate indications for its use are less well characterised, and performance in other clinical settings may differ. Methods: We examined a prospectively collected cohort comprising the first 215 patients undergoing EBUS‐TBNA at our institution. Patients were analysed according to the clinical and radiological indication for referral. We also examined the effect of the procedural learning curve on diagnostic sensitivity. Results: A total of 215 patients underwent 216 EBUS‐TBNA procedures. EBUS‐TBNA returned adequate tissue for cytopathological analysis in 202 of 216 procedures (94%). Overall sensitivity for detection of malignancy was 0.92 (95% confidence interval 0.86–0.96); however, this varied according to the primary indication for EBUS‐TBNA. Diagnostic sensitivity was high among all sub‐groups, but the negative predictive value varied depending on the clinical indication for the procedure. We estimate 104 invasive surgical procedures and 32 inpatient admissions were avoided by use of EBUS‐TBNA. Significant improvement in diagnostic performance was seen after 20 procedures were completed, and diagnostic accuracy did not peak until after 50 procedures. Conclusions: EBUS‐TBNA is able to accurately confirm the histology of a large number of disease processes, both malignant and benign, across all clinical indications studied. The procedure is safe even when carried out by proceduralists with minimal prior experience. Diagnostic performance continues to improve beyond 50 cases.
Background: Endovascular thrombectomy (EVT) for management of large vessel occlusion (LVO) acute ischaemic stroke is now current best practice. Aim: To determine whether bridging intravenous (i.v.) alteplase therapy confers any clinical benefit. Methods: A retrospective study of patients treated with EVT for LVO was performed. Outcomes were compared between patients receiving thrombolysis plus EVT and EVT alone. Primary end-points were reperfusion rate, 90-day functional outcome and mortality using the modified Rankin Scale (mRS), and symptomatic intracranial haemorrhage (sICH). Results: A total of 355 patients who underwent EVT was included: 210 with thrombolysis (59%) and 145 without (41%). The reperfusion rate was higher in the group receiving i.v. tissue plasminogen activator (tPA) (unadjusted odds ratio (OR) 2.2, 95% confidence interval (CI): 1.29-3.73, P = 0.004), although this effect was attenuated when all variables were considered (adjusted OR (AOR) 1.22, 95% CI: 0.60-2.5, P = 0.580). The percentage achieving functional independence (mRS 0-2) at 90 days was higher in patients who received bridging i.v. tPA (AOR 2.17, 95% CI: 1.06-4.44, P = 0.033). There was no significant difference in major complications, including sICH (AOR 1.4, 95% CI: 0.51-3.83, P = 0.512). Ninety-day mortality was numerically lower in the bridging i.v. tPA group, although this did not reach statistical significance (AOR 0.79, 95% CI: 0.36-1.74, P = 0.551). Fewer thrombectomy passes (2 versus 3, P = 0.012) were required to achieve successful reperfusion in the i.v. tPA group. Successful reperfusion (modified thrombolysis in cerebral infarction score >= 2b) was the strongest predictor of 90-day functional independence (AOR 10.4, 95% CI: 3.6-29.7, P < 0.001). Conclusion: Our study supports the current practice of administering i.v. alteplase before endovascular therapy.
Background: In Australia, one third of HIV diagnoses occur late, with an estimated 11% of people with HIV unaware of their diagnosis. Undiagnosed and untreated HIV infection increases morbidity in the HIV-positive person and allows onward transmission of HIV. Aim: To determine the rate of HIV testing in acute general medicine patients with HIV indicator conditions (ICs) and evaluate the effectiveness of an educational intervention in improving testing rates. Methods: Single-centre, tertiary hospital, before-after study of general medicine inpatients with ICs for 12 weeks before and 10 weeks after an educational intervention focusing on recommendations for HIV testing, including ICs. The REASON Cohort Discovery Tool was used to search for ICs using ICD-10 codes and laboratory data. The presence of ICs was estimated, and HIV testing rates before and after the intervention were compared. Regression analysis was used to identify characteristics associated with HIV testing. Results: Of 1414 admissions in the baseline period and 946 in the post-intervention period, 161 (11.4%) and 132 (14.0%) respectively had at least one IC present. There were 18 (11.2%) HIV tests performed for admissions with ICs in the pre-intervention period, which increased to 27 (20.5%) (p=0.028) in the post-intervention period. Younger patients were more likely to be tested, and regression analysis identified the educational intervention (adjusted odds ratio (aOR) 2.2 (1.1, 4.4)) as significantly associated with testing. Conclusions: Although HIV testing rates for ICs doubled following the intervention, they remained unacceptably low. The recently introduced electronic medical record presents opportunities to prompt HIV testing. This article is protected by copyright. All rights reserved.
Background: The benefit of extended-duration thromboprophylaxis beyond the hospital stay in patients hospitalised for acute medical illness remains controversial. Aim: To perform a meta-analysis of randomised controlled trials (RCTs) in order to examine the efficacy and safety of extended-duration anticoagulation for venous thromboembolism (VTE) prophylaxis in this high-risk population. Methods: An electronic database search was conducted to include all RCTs comparing extended-duration versus short-duration prophylactic anticoagulation in medically ill patients. The primary efficacy outcome was the composite of asymptomatic deep vein thrombosis (DVT), symptomatic VTE and death from VTE-related causes. Results: Five RCTs were included, totalling 40 124 patients with a mean age of 71 years; 51% were male. In comparison with standard-duration therapy, extended-duration thromboprophylaxis was associated with a significant reduction in the primary efficacy outcome (RR 0.75; 95% CI 0.67-0.85; P<0.01), symptomatic VTE (RR 0.53; 95% CI 0.33-0.84; P<0.01) and asymptomatic DVT (RR 0.81; 95% CI 0.71-0.94; P<0.01). However, there were no significant differences between the groups with regard to VTE-related death (RR 0.81; 95% CI 0.60-1.10; P=0.18) or all-cause death (RR 0.97; 95% CI 0.88-1.08; P=0.64). In contrast, extended-duration thromboprophylaxis was associated with an increased risk of major bleeding (RR 2.04; 95% CI 1.42-2.91; P<0.01) and non-major clinically relevant bleeding (RR 1.81; 95% CI 1.29-2.53; P<0.01). Conclusions: Among hospitalised medically ill patients, prolonged venous thromboprophylaxis was associated with a decreased risk of the composite primary efficacy outcome and an increased risk of bleeding, with no significant difference in VTE-related death.
Background: Donor safety is paramount when performing bone marrow stem cell harvest. The incidence of full blood count (FBC) abnormalities among donors and the variables associated with anaemia after marrow harvest are not well established. Aims: To describe the frequency of FBC abnormalities prior to bone marrow stem cell harvest and to identify variables associated with post-harvest anaemia. Methods: Outcomes of 80 consecutive adult marrow harvests performed at our centre were analysed retrospectively. Results: FBC abnormalities were present in 28% of donors prior to marrow harvest, with normocytic anaemia the most common abnormality (13%). Reduced donor Hb was independently correlated with a lower CD34+ cell count per kg of recipient body weight. Anaemia (Hb <100 g/L) was seen in 20% of donors after harvest, with a median decrease in Hb of 19 g/L. Variables independently associated with anaemia after harvest included donor-to-recipient weight ratio (p=0.011), high collection volume (p=0.044) and female gender (p=0.023). Total nucleated cell and CD34+ concentrations in the final collected product were inversely associated with harvested marrow volume (p<0.001). Conclusions: Pre-harvest anaemia should be corrected where possible, particularly in female donors. Marrow collection volume should be minimised to reduce post-harvest anaemia, optimise CD34+ cell number and improve nucleated and stem cell concentrations in the harvest product.
An increasing prevalence of diabetes mellitus has led to a high risk of diabetic foot infections (DFI) and associated morbidity. However, little is known about the relationship between DFI and mortality. We aimed to investigate the risk of mortality and associated factors in patients with DFI in an Australian context. We conducted a prospective cohort study of inpatients with DFI between May 2012 and October 2016 at Royal Darwin Hospital, a tertiary referral hospital for the Top End of the Northern Territory. The primary outcome was one-year mortality, with Cox regression analysis undertaken to assess risk factors for mortality. 413 consecutive adult diabetic patients with 737 admissions were referred to the High-Risk Foot Service for DFI. Cumulative risk of mortality at one year was 8.9% (95%CI 6.4-12.2). On univariable analysis, mortality was associated with older age (hazard ratio [HR] per year increase 1.08, 95%CI 1.06-1.11, p=0.001) and haemodialysis (HR 3.64, 1.74-7.62), while a protective association was also observed (HR 0.45, 0.20-0.99, p<0.05). After adjusting for confounders, independent risk factors for mortality were haemodialysis (adjusted HR 5.76, 95%CI 2.28-14.59, p<0.001) and older age (adjusted HR 1.09, 1.06-1.13, p<0.001). Patients on haemodialysis had a cumulative risk of mortality of 24.5% (95%CI 14.0-40.8) at one year. There is a high risk of mortality associated with DFI, substantially increased in patients undergoing haemodialysis, highlighting the importance of early and dedicated interventions targeted at this high-risk group.
The delivery of healthcare, which includes the informed consent process, is moving to a digital environment. This change in informed consent delivery will bring opportunities, risks and unintended consequences. Physicians are well placed to contribute to the ongoing dialogue about what is needed to make the informed consent process fit for purpose in the digital age.
Knowledge about patients with acute liver failure (ALF) in Australia and New Zealand (ANZ) is lacking. We hypothesised that the pattern of disease would be similar to that in previous studies and that, despite low transplantation rates, mortality would be comparable. We obtained data from the ANZ Intensive Care Society Adult Patient Database and the ANZ Liver Transplant Registry for the ten years commencing 2005 and analysed patient outcomes. During the study period, 1 022 698 adults were admitted to intensive care units (ICUs) across ANZ, of whom 723 had ALF. The estimated annual incidence of ALF over this period was 3.4/million people and increased over time (p=0.001). ALF patients had high illness severity (APACHE III score 79.8 vs 50.1 in non-ALF patients; p<0.0001) and were more likely to be younger, female, pregnant and immunosuppressed. ALF was an independent predictor of mortality (OR 1.5 (1.26-1.79); p<0.0001). At less than 23%, the use of liver transplantation was low, but the mortality of 39% was similar to that in previous studies. ALF is a rare but increasingly common diagnosis in ANZ ICUs. Low transplantation rates in ANZ for ALF do not appear to be associated with higher mortality rates than reported in the literature.