METHODS: This was a single-center, retrospective study. Echocardiographic assessment of the LV geometry, mass, and free wall thickness was performed before stenting and before the arterial switch operation. Patients then underwent the arterial switch operation, and the postoperative outcomes were reviewed.
RESULTS: There were 11 consecutive patients (male, 81.8%; mean age at stenting, 43.11 ± 18.19 days) with TGA-IVS with involuted LV who underwent LV retraining by ductal stenting from July 2013 to December 2017. Retraining by ductal stenting failed in 4 patients (36.3%): 2 required pulmonary artery banding, and another 2 had an LV mass index of less than 35 g/m2. Patients in the successful group had an improved LV mass index, from 45.14 ± 17.91 to 81.86 ± 33.11 g/m2 (p = 0.023), compared with 34.50 ± 10.47 to 20.50 ± 9.88 g/m2 (p = 0.169) in the failed group, and showed improved LV geometry after ductal stenting. Failed retraining was associated with an increased need for extracorporeal support (14.5% vs 50%, p = 0.012). An atrial septal defect-to-interatrial septum length ratio of more than 0.38 was associated with failed LV retraining.
CONCLUSIONS: Ductal stenting is an effective method to retrain the involuted LV in TGA-IVS. A large atrial septal defect (atrial septal defect-to-interatrial septum length ratio >0.38) was associated with poor response to LV retraining.
METHODS: DNA methylation profiling was used to screen for differentially hypermethylated genes in OSCC. Three selected differentially hypermethylated genes, p16, DDAH2 and DUSP1, were further validated for methylation status and protein expression. Correlations between demographic and clinicopathological characteristics, survival rate, and hypermethylation of the p16, DDAH2 and DUSP1 genes in OSCC patients were analysed in the study.
RESULTS: Methylation profiling demonstrated 33 promoter-hypermethylated genes in OSCC. The differentially hypermethylated genes p16, DDAH2 and DUSP1 showed methylation-specific polymerase chain reaction positivity of 78%, 80% and 88%, respectively, while immunoreactivity was 24% for DDAH2 and 22% for DUSP1. Promoter hypermethylation of the p16 gene was significantly associated with tumour site (buccal mucosa, gum, tongue and lip; P=0.001). In addition, DDAH2 methylation level correlated significantly with patients' age (P=0.050). In this study, the overall five-year survival rate for OSCC patients was 38.1% and was influenced by sex.
CONCLUSIONS: The study identified 33 promoter-hypermethylated genes that were significantly silenced in OSCC and that may be involved in an important mechanism of oral carcinogenesis. Our approach revealed the differentially hypermethylated genes DDAH2 and DUSP1 as signature candidates that could be further developed into diagnostic, prognostic and therapeutic biomarkers for OSCC.
METHODOLOGY: This was a retrospective analysis of all OHCA cases collected from the Pan-Asian Resuscitation Outcomes Study (PAROS) registry in 7 countries in Asia between 2009 and 2012. We included OHCA cases of presumed cardiac etiology, aged 18 years and above, in which resuscitation was attempted by EMS. We performed multivariate logistic regression analyses to assess the relationship between initial and subsequent shockable rhythm and survival and neurological outcomes. Two-stage seemingly unrelated bivariate probit models were developed to jointly model the survival and neurological outcomes. We adjusted for the clustering effects of country-level variance in all models.
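As an illustration of the kind of logistic regression described above, the following is a minimal, self-contained sketch. All data and coefficients are synthetic and hypothetical (not PAROS data), and the sketch omits the covariate adjustment and country-level clustering of the actual analysis; it only shows how a Newton-Raphson fit yields an odds ratio for a single binary predictor such as initial shockable rhythm:

```python
import math
import random

random.seed(0)

# Hypothetical synthetic cohort: x = initial shockable rhythm (0/1),
# y = survival to discharge. Coefficients are invented for illustration;
# beta1 = 1.8 corresponds to a true odds ratio of about 6.
def simulate(n=5000, beta0=-2.5, beta1=1.8):
    data = []
    for _ in range(n):
        x = 1 if random.random() < 0.13 else 0
        p = 1.0 / (1.0 + math.exp(-(beta0 + beta1 * x)))
        y = 1 if random.random() < p else 0
        data.append((x, y))
    return data

def fit_logistic(data, iters=25):
    """Newton-Raphson fit of a two-parameter logistic model (intercept + slope)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = p * (1.0 - p)          # observation weight for the Hessian
            g0 += y - p                # gradient, intercept
            g1 += (y - p) * x          # gradient, slope
            h00 += w
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01    # 2x2 Hessian determinant
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

b0, b1 = fit_logistic(simulate())
odds_ratio = math.exp(b1)              # exponentiated coefficient = odds ratio
print(f"estimated OR for the binary predictor: {odds_ratio:.2f}")
```

With multiple covariates the gradient and Hessian become vector- and matrix-valued, and in practice a library GLM fitter would be used rather than a hand-rolled solver.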
RESULTS: A total of 40,160 OHCA cases met the inclusion criteria. There were 5356 OHCA cases (13.3%) with initial shockable rhythm and 33,974 (84.7%) with initial non-shockable rhythm. After adjustment for baseline and prehospital characteristics, OHCA with initial shockable rhythm (odds ratio [OR]=6.10, 95% confidence interval [CI]=5.06-7.34) and subsequent conversion to shockable rhythm (OR=2.00, 95% CI=1.10-3.65) independently predicted better survival-to-hospital-discharge outcomes. Subsequent conversion to shockable rhythm significantly improved survival to admission, survival to discharge, and post-arrest overall and cerebral performance outcomes in the multivariate logistic regression and 2-stage analyses.
CONCLUSION: Initial shockable rhythm was the strongest predictor of survival. However, conversion to a subsequent shockable rhythm significantly improved post-arrest survival and neurological outcomes. This study suggests the importance of early resuscitation efforts even for initially non-shockable rhythms, which has implications for prognosis and for the selection of subsequent post-resuscitation therapy.
PATIENTS AND METHODS: Sixty-two patients with AML excluding acute promyelocytic leukemia were retrospectively analyzed. Patients in the earlier cohort (n = 36) were treated on the Medical Research Council (MRC) AML12 protocol, whereas those in the recent cohort (n = 26) were treated on the Malaysia-Singapore AML protocol (MASPORE 2006), which differed in terms of risk group stratification, cumulative anthracycline dose, and timing of hematopoietic stem-cell transplantation for high-risk patients.
RESULTS: Significant improvements in 10-year overall survival and event-free survival were observed in patients treated with the recent MASPORE 2006 protocol compared to the earlier MRC AML12 protocol (overall survival: 88.0% ± 6.5% vs 50.1% ± 8.6%, P = .002; event-free survival: 72.1% ± 9.0% vs 50.1% ± 8.6%, P = .045). In univariate analysis, patients in the recent cohort had a significantly lower intensive care unit admission rate (11.5% vs 47.2%, P = .005) and a numerically lower relapse rate (26.9% vs 50.0%, P = .068) compared to the earlier cohort. Multivariate analysis showed that treatment protocol was the only independent predictive factor for overall survival (hazard ratio = 0.21; 95% confidence interval, 0.06-0.73, P = .014).
CONCLUSION: Outcomes of pediatric AML patients have improved over time. The more recent MASPORE 2006 protocol led to significant improvement in long-term survival rates and reduction in intensive care unit admission rate.
METHODS: The Prospective Urban Rural Epidemiology (PURE) study is a large, epidemiological cohort study of individuals aged 35-70 years (enrolled between Jan 1, 2003, and March 31, 2013) in 18 countries with a median follow-up of 7·4 years (IQR 5·3-9·3). Dietary intake of 135 335 individuals was recorded using validated food frequency questionnaires. The primary outcomes were total mortality and major cardiovascular events (fatal cardiovascular disease, non-fatal myocardial infarction, stroke, and heart failure). Secondary outcomes were all myocardial infarctions, stroke, cardiovascular disease mortality, and non-cardiovascular disease mortality. Participants were categorised into quintiles of nutrient intake (carbohydrate, fats, and protein) based on percentage of energy provided by nutrients. We assessed the associations of carbohydrate, total fat, and each type of fat with cardiovascular disease and total mortality. We calculated hazard ratios (HRs) using a multivariable Cox frailty model with random intercepts to account for centre clustering.
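As a toy illustration of hazard-ratio estimation of the kind described above, the following is a minimal sketch of a Newton-Raphson fit of the Cox partial likelihood with a single binary covariate on synthetic data. All numbers are hypothetical, and the sketch deliberately omits the multivariable adjustment and the centre-level random intercepts (frailty terms) of the actual PURE analysis:

```python
import math
import random

random.seed(1)

# Hypothetical synthetic cohort: x = binary exposure indicator,
# exponential event times with a true hazard ratio of exp(beta_true).
def simulate(n=2000, beta_true=0.25):
    rows = []
    for _ in range(n):
        x = 1 if random.random() < 0.2 else 0
        t = random.expovariate(math.exp(beta_true * x))  # event time
        c = random.expovariate(0.5)                      # censoring time
        rows.append((min(t, c), t <= c, x))              # (time, event, x)
    return rows

def fit_cox(rows, iters=20):
    """Newton-Raphson on the Cox partial likelihood, single covariate."""
    rows = sorted(rows, reverse=True)  # descending time: risk sets accumulate
    beta = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        s0 = s1 = s2 = 0.0             # running risk-set sums
        for t, event, x in rows:
            r = math.exp(beta * x)
            s0 += r
            s1 += r * x
            s2 += r * x * x
            if event:
                mean = s1 / s0         # risk-set weighted mean of x
                grad += x - mean
                hess += s2 / s0 - mean * mean
        beta += grad / hess
    return beta

beta = fit_cox(simulate())
hr = math.exp(beta)                    # exponentiated coefficient = hazard ratio
print(f"estimated HR: {hr:.2f}")
```

A frailty model would add a per-centre random effect to the linear predictor; in practice this is done with a survival library rather than by hand.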
FINDINGS: During follow-up, we documented 5796 deaths and 4784 major cardiovascular disease events. Higher carbohydrate intake was associated with an increased risk of total mortality (highest [quintile 5] vs lowest quintile [quintile 1] category, HR 1·28 [95% CI 1·12-1·46], ptrend=0·0001) but not with the risk of cardiovascular disease or cardiovascular disease mortality. Intake of total fat and each type of fat was associated with lower risk of total mortality (quintile 5 vs quintile 1, total fat: HR 0·77 [95% CI 0·67-0·87], ptrend<0·0001; saturated fat: HR 0·86 [0·76-0·99], ptrend=0·0088; monounsaturated fat: HR 0·81 [0·71-0·92], ptrend<0·0001; and polyunsaturated fat: HR 0·80 [0·71-0·89], ptrend<0·0001). Higher saturated fat intake was associated with lower risk of stroke (quintile 5 vs quintile 1, HR 0·79 [95% CI 0·64-0·98], ptrend=0·0498). Total fat and saturated and unsaturated fats were not significantly associated with risk of myocardial infarction or cardiovascular disease mortality.
INTERPRETATION: High carbohydrate intake was associated with higher risk of total mortality, whereas total fat and individual types of fat were related to lower total mortality. Total fat and types of fat were not associated with cardiovascular disease, myocardial infarction, or cardiovascular disease mortality, whereas saturated fat had an inverse association with stroke. Global dietary guidelines should be reconsidered in light of these findings.
FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).
METHODS: We did a prospective cohort study (Prospective Urban Rural Epidemiology [PURE]) in 135 335 individuals aged 35 to 70 years without cardiovascular disease from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. We documented their diet using country-specific food frequency questionnaires at baseline. Standardised questionnaires were used to collect information about demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease. The follow-up period varied based on the date when recruitment began at each site or country. The main clinical outcomes were major cardiovascular disease (defined as death from cardiovascular causes and non-fatal myocardial infarction, stroke, and heart failure), fatal and non-fatal myocardial infarction, fatal and non-fatal strokes, cardiovascular mortality, non-cardiovascular mortality, and total mortality. Cox frailty models with random effects were used to assess associations of fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.
FINDINGS: Participants were enrolled into the study between Jan 1, 2003, and March 31, 2013. For the current analysis, we included all unrefuted outcome events in the PURE study database through March 31, 2017. Overall, combined mean fruit, vegetable, and legume intake was 3·91 (SD 2·77) servings per day. During a median 7·4 years (IQR 5·5-9·3) of follow-up, 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths were documented. Higher total fruit, vegetable, and legume intake was inversely associated with major cardiovascular disease, myocardial infarction, cardiovascular mortality, non-cardiovascular mortality, and total mortality in the models adjusted for age, sex, and centre (random effect). The estimates were substantially attenuated in the multivariable adjusted models for major cardiovascular disease (hazard ratio [HR] 0·90, 95% CI 0·74-1·10, ptrend=0·1301), myocardial infarction (0·99, 0·74-1·31; ptrend=0·2033), stroke (0·92, 0·67-1·25; ptrend=0·7092), cardiovascular mortality (0·73, 0·53-1·02; ptrend=0·0568), non-cardiovascular mortality (0·84, 0·68-1·04; ptrend=0·0038), and total mortality (0·81, 0·68-0·96; ptrend<0·0001). The HR for total mortality was lowest for three to four servings per day (0·78, 95% CI 0·69-0·88) compared with the reference group, with no further apparent decrease in HR with higher consumption. When examined separately, fruit intake was associated with lower risk of cardiovascular, non-cardiovascular, and total mortality, while legume intake was inversely associated with non-cardiovascular death and total mortality (in fully adjusted models). For vegetables, raw vegetable intake was strongly associated with a lower risk of total mortality, whereas cooked vegetable intake showed a modest benefit against mortality.
INTERPRETATION: Higher fruit, vegetable, and legume consumption was associated with a lower risk of non-cardiovascular and total mortality. Benefits appear to be maximal for both non-cardiovascular mortality and total mortality at three to four servings per day (equivalent to 375-500 g/day).
FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).
OBJECTIVE: We evaluated the distribution and interactive association of response time interval (RTI) and scene time interval (STI) with survival outcomes of OHCA in four Asian metropolitan cities.
METHODS: An OHCA cohort from the Pan-Asian Resuscitation Outcomes Study (PAROS) conducted between January 2009 and December 2011 was analyzed. Adult EMS-treated cardiac arrests of presumed cardiac origin were included. A multivariable logistic regression model with an interaction term was used to evaluate the effect of STI on survival outcomes across different RTI categories. Risk-adjusted predicted rates of survival outcomes were calculated and compared with observed rates.
RESULTS: A total of 16,974 OHCA cases were analyzed after serial exclusion. Median RTI was 6.0 min (interquartile range [IQR] 5.0-8.0 min) and median STI was 12.0 min (IQR 8.0-16.1 min). Prolonged STI in the longest RTI group was associated with a lower rate of survival to discharge or survival to 30 days after arrest (adjusted odds ratio [aOR] 0.59; 95% confidence interval [CI] 0.42-0.81), as well as poorer neurologic outcome (aOR 0.63; 95% CI 0.41-0.97), without an increased chance of prehospital return of spontaneous circulation (aOR 1.12; 95% CI 0.88-1.45).
CONCLUSIONS: Prolonged STI in OHCA with a delayed response time was negatively associated with survival outcomes in four Asian metropolitan cities that use the scoop-and-run EMS model. Establishing an optimal STI based on the response time could be considered.
METHODS: A longitudinal study of biopsy-proven NAFLD patients was conducted at an Asian tertiary hospital from November 2012 to January 2017. Patients with paired liver biopsies and LSM were followed prospectively for liver-related and non-liver-related complications, and survival.
RESULTS: The data for 113 biopsy-proven NAFLD patients (mean age 51.3 ± 10.6 years, male 50%) were analyzed. At baseline, advanced fibrosis based on histology and LSM was observed in 22% and 46% of patients, respectively. Paired liver biopsy and LSM at a 1-year interval were available in 71% and 80% of patients, respectively. High-risk cases (defined as patients with advanced fibrosis at baseline who had no fibrosis improvement, and patients who developed advanced fibrosis on repeat assessment) were seen in 23% and 53% of patients, based on paired liver biopsy and LSM, respectively. Type 2 diabetes mellitus was independently associated with high-risk cases. The median follow-up was 37 months, with a total follow-up of 328 person-years. High-risk cases based on paired liver biopsy had significantly higher rates of liver-related complications (p = 0.002) but no difference in other outcomes. High-risk patients based on paired LSM had significantly higher rates of liver-related complications (p = 0.046), cardiovascular events (p = 0.025), and composite outcomes (p = 0.006).
CONCLUSION: Repeat LSM can predict liver-related complications, similar to paired liver biopsy, and may be useful in identifying patients who may be at an increased risk of cardiovascular events. Further studies in a larger cohort and with a longer follow-up should be carried out to confirm these observations.
METHODS: HIV-positive patients enrolled in the TREAT Asia HIV Observational Database who had used second-line ART for ≥6 months were included. ART use and rates and predictors of second-line treatment failure were evaluated.
RESULTS: There were 302 eligible patients. Most were male (76.5%) and exposed to HIV via heterosexual contact (71.5%). Median age at second-line initiation was 39.2 years, median CD4 cell count was 146 cells per cubic millimeter, and median HIV viral load was 16,224 copies per milliliter. Patients started second-line ART before 2007 (n = 105), in 2007-2010 (n = 147), and after 2010 (n = 50). Ritonavir-boosted lopinavir and atazanavir accounted for the majority of protease inhibitor use after 2006. Median follow-up time on second-line therapy was 2.3 years. The rates of treatment failure and mortality per 100 patient-years were 8.8 (95% confidence interval: 7.1 to 10.9) and 1.1 (95% confidence interval: 0.6 to 1.9), respectively. Older age, high baseline viral load, and use of a protease inhibitor other than lopinavir or atazanavir were associated with a significantly shorter time to second-line failure.
CONCLUSIONS: Increased access to viral load monitoring to facilitate early detection of first-line ART failure and subsequent treatment switch is important for maximizing the durability of second-line therapy in Asia. Although second-line ART is highly effective in the region, the reported rate of failure emphasizes the need for third-line ART in a small portion of patients.
METHODOLOGY/PRINCIPAL FINDINGS: This yearlong field surveillance identified Ae. aegypti breeding in outdoor containers on an enormous scale. Through a sequence of experiments incorporating outdoor- and indoor-adapting as well as adapted populations, we observed that the indoor environment better supported survival of Ae. aegypti, and the observed death patterns could be explained by differences in body size. The gonotrophic period was much shorter in large-bodied females. Fecundity tended to be greater in indoor-acclimated females. We also found an increased tendency toward multiple feeding in outdoor-adapted females, which were smaller than their indoor-breeding counterparts.
CONCLUSION/SIGNIFICANCE: The data presented here suggest that acclimatization of Ae. aegypti to the outdoor environment may not decrease its lifespan or gonotrophic activity but rather increases breeding opportunities (a greater number of discarded containers outdoors) and the rate of larval development, while producing smaller body sizes at emergence. Size is likely to be correlated with disease transmission. In general, small size in Aedes females will favor increased blood-feeding frequency, resulting in larger population sizes and greater disease occurrence.