Displaying publications 161 - 180 of 206 in total

  1. Prando C, Samarina A, Bustamante J, Boisson-Dupuis S, Cobat A, Picard C, et al.
    Medicine (Baltimore), 2013 Mar;92(2):109-122.
    PMID: 23429356 DOI: 10.1097/MD.0b013e31828a01f9
    Autosomal recessive interleukin (IL)-12 p40 (IL-12p40) deficiency is a rare genetic etiology of mendelian susceptibility to mycobacterial disease (MSMD). We report the genetic, immunologic, and clinical features of 49 patients from 30 kindreds originating from 5 countries (India, Iran, Pakistan, Saudi Arabia, and Tunisia). There are only 9 different mutant alleles of the IL12B gene: 2 small insertions, 3 small deletions, 2 splice site mutations, and 1 large deletion, each causing a frameshift and leading to a premature stop codon, and 1 nonsense mutation. Four of these 9 variants are recurrent, affecting 25 of the 30 reported kindreds, due to founder effects in specific countries. All patients are homozygous and display complete IL-12p40 deficiency. As a result, the patients lack detectable IL-12p70 and IL-12p40 and have low levels of interferon gamma (IFN-γ). The clinical features are characterized by childhood onset of bacille Calmette-Guérin (attenuated Mycobacterium bovis strain) (BCG) and Salmonella infections, with recurrences of salmonellosis (36.4%) more common than recurrences of mycobacterial disease (25%). BCG vaccination led to BCG disease in 40 of the 41 patients vaccinated (97.5%). Multiple mycobacterial infections were rare, observed in only 3 patients, whereas the association of salmonellosis and mycobacteriosis was observed in 9 patients. A few other infections were diagnosed, including chronic mucocutaneous candidiasis (n = 3), nocardiosis (n = 2), and klebsiellosis (n = 1). IL-12p40 deficiency has a high but incomplete clinical penetrance, with 33.3% of genetically affected relatives of index cases showing no symptoms. However, the prognosis is poor, with mortality rates of up to 28.6%. Overall, the clinical phenotype of IL-12p40 deficiency closely resembles that of interleukin 12 receptor β1 (IL-12Rβ1) deficiency. 
In conclusion, IL-12p40 deficiency is more common than initially thought and should be considered worldwide in patients with MSMD and other intramacrophagic infectious diseases, salmonellosis in particular.
    Matched MeSH terms: Survival Analysis
  2. Saxena N, Hartman M, Bhoo-Pathy N, Lim JN, Aw TC, Iau P, et al.
    World J Surg, 2012 Dec;36(12):2838-46.
    PMID: 22926282 DOI: 10.1007/s00268-012-1746-2
    There are large differences in socio-economic growth within the region of South East Asia, leading to sharp contrasts in health-systems development between countries. This study compares breast cancer presentation and outcome between patients from a high income country (Singapore) and a middle income country (Malaysia) in South East Asia.
    Matched MeSH terms: Survival Analysis
  3. Carroll RP, Deayton S, Emery T, Munasinghe W, Tsiopelas E, Fleet A, et al.
    Hum Immunol, 2019 Aug;80(8):573-578.
    PMID: 31014826 DOI: 10.1016/j.humimm.2019.04.005
    High levels of angiotensin receptor antibodies (ATRab) are associated with acute cellular and humoral rejection, vascular occlusion, de novo human leucocyte antigen donor-specific antibody (HLA DSA) and poor graft survival in kidney transplant recipients (KTR). Since 2015 we have proactively managed patients "at risk" (AR) with ATRab >17 U/ml with perioperative plasma exchange (PLEX) and/or angiotensin receptor blockade (ARB). 44 patients were treated with this protocol. 265 KTR with ATRab ≤17 U/ml deemed "low risk" (LR) were transplanted under standard conditions. PLEX and ARB were not associated with increased risk of delayed graft function requiring haemodialysis (HDx), hyperkalaemia >5.5 mmol/l requiring HDx, or the combined clinical end-point of severe hypotension, blood transfusion and re-operation for bleeding. Rejection rates were similar at 90 days: 8/44 (18%) in the AR group and 36/265 (14%) in the LR group (p = 0.350). Death-censored graft survival was the same between the AR and LR groups, with a 94% 48-month graft survival; hazard ratio (log-rank) 1.16 [95% CI 0.2-5.8], p = 0.844. Proactive treatment of ATRab >17 U/ml with PLEX and/or ARB was thus not associated with increased rates of perioperative complications, and rates of rejection and death-censored graft survival at 4 years were comparable to those of KTR with ATRab ≤17 U/ml.
    Matched MeSH terms: Survival Analysis
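The rejection comparison in the entry above (8/44 in the AR group vs 36/265 in the LR group) is a standard 2 × 2 comparison. As a minimal sketch of how an odds ratio and Wald 95% CI are computed from such counts (pure Python; the function name is ours, and the figures in the comment are recomputed from the abstract's counts, not values reported by the paper):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Rejection at 90 days: 8 of 44 (AR group) vs 36 of 265 (LR group)
or_, lo, hi = odds_ratio_ci(8, 44 - 8, 36, 265 - 36)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR = 1.41 (95% CI 0.61-3.28)
```

A confidence interval spanning 1 is consistent with the non-significant p value the abstract reports for this comparison.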
  4. Leong MC, Ahmed Alhassan AA, Sivalingam S, Alwi M
    Ann Thorac Surg, 2019 09;108(3):813-819.
    PMID: 30998905 DOI: 10.1016/j.athoracsur.2019.03.045
    BACKGROUND: Ductal stenting is performed to retrain involuted left ventricles (LVs) in patients with d-transposition of the great arteries and intact ventricular septum (TGA-IVS). However, its efficacy is largely unknown. This study aimed to determine the safety and efficacy of ductal stenting in retraining of the involuted LV in patients with TGA-IVS.

    METHODS: This was a single-center, retrospective study. Echocardiographic assessment of the LV geometry, mass, and free wall thickness was performed before stenting and before the arterial switch operation. Patients then underwent the arterial switch operation, and the postoperative outcomes were reviewed.

    RESULTS: There were 11 consecutive patients (male, 81.8%; mean age at stenting, 43.11 ± 18.19 days) with TGA-IVS and an involuted LV who underwent LV retraining by ductal stenting from July 2013 to December 2017. Retraining by ductal stenting failed in 4 patients (36.3%). Two patients required pulmonary artery banding, and another 2 had an LV mass index of less than 35 g/m2. Patients in the successful group had an improved LV mass index, from 45.14 ± 17.91 to 81.86 ± 33.11 g/m2 (p = 0.023), compared with 34.50 ± 10.47 to 20.50 ± 9.88 g/m2 (p = 0.169) in the failed group, and showed improved LV geometry after ductal stenting. The failed group was associated with an increased need for extracorporeal support (14.5% vs 50%, p = 0.012). An atrial septal defect-to-interatrial septum length ratio of more than 0.38 was associated with failed LV retraining.

    CONCLUSIONS: Ductal stenting is an effective method to retrain the involuted LV in TGA-IVS. A large atrial septal defect (atrial septal defect-to-interatrial septum length ratio >0.38) was associated with poor response to LV retraining.

    Matched MeSH terms: Survival Analysis
  5. Khor GH, Froemming GR, Zain RB, Abraham MT, Omar E, Tan SK, et al.
    Int J Med Sci, 2013;10(12):1727-39.
    PMID: 24155659 DOI: 10.7150/ijms.6884
    BACKGROUND: Hypermethylation in promoter regions of genes might lead to altered gene functions and result in malignant cellular transformation. Thus, biomarker identification for hypermethylated genes would be very useful for early diagnosis, prognosis, and therapeutic treatment of oral squamous cell carcinoma (OSCC). The objectives of this study were to screen and validate differentially hypermethylated genes in OSCC and to correlate the hypermethylation-induced genes with demographic and clinicopathological characteristics and the survival rate of OSCC.

    METHODS: DNA methylation profiling was utilized to screen the differentially hypermethylated genes in OSCC. Three selected differentially-hypermethylated genes of p16, DDAH2 and DUSP1 were further validated for methylation status and protein expression. The correlation between demographic, clinicopathological characteristics, and survival rate of OSCC patients with hypermethylation of p16, DDAH2 and DUSP1 genes were analysed in the study.

    RESULTS: Methylation profiling demonstrated 33 promoter-hypermethylated genes in OSCC. The differentially hypermethylated genes p16, DDAH2 and DUSP1 showed methylation-specific polymerase chain reaction positivity of 78%, 80% and 88%, respectively, with immunoreactivity of 24% and 22% for DDAH2 and DUSP1. Promoter hypermethylation of the p16 gene was significantly associated with tumour site (buccal, gum, tongue and lip; P=0.001). In addition, DDAH2 methylation level correlated significantly with patients' age (P=0.050). In this study, the overall five-year survival rate was 38.1% for OSCC patients and was influenced by sex difference.

    CONCLUSIONS: The study has identified 33 promoter hypermethylated genes that were significantly silenced in OSCC, which might be involved in an important mechanism in oral carcinogenesis. Our approaches revealed signature candidates of differentially hypermethylated genes of DDAH2 and DUSP1 which can be further developed as potential biomarkers for OSCC as diagnostic, prognostic and therapeutic targets in the future.

    Matched MeSH terms: Survival Analysis
  6. Yeo KK, Tai BC, Heng D, Lee JM, Ma S, Hughes K, et al.
    Diabetologia, 2006 Dec;49(12):2866-73.
    PMID: 17021918 DOI: 10.1007/s00125-006-0469-z
    AIMS/HYPOTHESIS: The aim of the study was to determine whether the risk of ischaemic heart disease (IHD) associated with diabetes mellitus differs between ethnic groups.

    METHODS: Registry linkage was used to identify IHD events in 5707 Chinese, Malay and Asian Indian participants from three cross-sectional studies conducted in Singapore between the years 1984 and 1995. The study provided a median of 10.2 years of follow-up with 240 IHD events experienced. We assessed the interaction between diabetes mellitus and ethnicity in relation to the risk of IHD events using Cox proportional hazards regression.

    RESULTS: Diabetes mellitus was more common in Asian Indians. Furthermore, diabetes mellitus was associated with a greater risk of IHD in Asian Indians. The hazard ratio when comparing diabetes mellitus with non-diabetes mellitus was 6.41 (95% CI 5.77-7.12) in Asian Indians and 3.07 (95% CI 1.86-5.06) in Chinese (p = 0.009 for interaction). Differences in the levels of established IHD risk factors among diabetics from the three ethnic groups did not appear to explain the differences in IHD risk.

    CONCLUSIONS/INTERPRETATION: Asian Indians are more susceptible to the development of diabetes mellitus than Chinese and Malays. When Asian Indians do develop diabetes mellitus, the risk of IHD is higher than for Chinese and Malays. Consequently, the prevention of diabetes mellitus amongst this ethnic group is particularly important for the prevention of IHD in Asia, especially given the size of the population at risk. Elucidation of the reasons for these ethnic differences may help us understand the pathogenesis of IHD in those with diabetes mellitus.
    Matched MeSH terms: Survival Analysis
  7. Nurul Aiezzah Z, Noor E, Hasidah MS
    Trop Biomed, 2010 Dec;27(3):624-31.
    PMID: 21399604 MyJurnal
    Malaria, caused by the Plasmodium parasite, is still a health problem worldwide due to resistance of the pathogen to current anti-malarials. The search for new anti-malarial agents has become more crucial with the emergence of chloroquine-resistant Plasmodium falciparum strains. Protein kinases such as mitogen-activated protein kinase (MAPK), MAPK kinase, cyclin-dependent kinase (CDK) and glycogen synthase kinase-3 (GSK-3) of parasitic protozoa are potential drug targets. GSK-3 is an enzyme that plays a vital role in multiple cellular processes, and has been linked to the pathogenesis of several diseases such as type II diabetes and Alzheimer's disease. In the present study, the antiplasmodial property of LiCl, a known GSK-3 inhibitor, was evaluated in vivo for antimalarial activity in mice infected with Plasmodium berghei. Infected ICR mice were intraperitoneally administered with LiCl for four consecutive days before (prophylactic test) and after (suppressive test) inoculation of P. berghei-parasitised erythrocytes. Results from the suppressive test (post-infection LiCl treatment) showed inhibition of erythrocytic parasitemia development by 62.06%, 85.67% and 85.18% as compared to non-treated controls for the 100 mg/kg, 300 mg/kg and 600 mg/kg dosages respectively. Both 300 mg/kg and 600 mg/kg LiCl showed similar significant (P<0.05) suppressive values to that obtained with chloroquine-treated mice (86% suppression). The prophylactic test indicated a significantly (P<0.05) high protective effect on mice pre-treated with LiCl, with suppression levels relatively comparable to chloroquine (84.07% and 86.26% suppression for the 300 mg/kg and 600 mg/kg LiCl dosages respectively versus 92.86% suppression by chloroquine). In both the suppressive and prophylactic tests, LiCl-treated animals survived longer than their non-treated counterparts. 
Mortality of the non-treated mice was 100% within 6 to 7 days of parasite inoculation, whereas mice administered with LiCl survived beyond 9 days. However, healthy non-infected mice administered with 600 mg/kg LiCl for four consecutive days showed increased mortality compared to animals receiving lower doses of LiCl; three of the seven mice intraperitoneally injected with the former dose of LiCl did not survive more than 24 h after administration of LiCl, whereas animals given the lower LiCl doses survived beyond four days of LiCl administration. To date, no direct evidence of anti-malarial activity in vivo or in vitro has been reported for LiCl. Evidence of anti-plasmodial activity of lithium in a mouse infection model is presented in this study.
    Matched MeSH terms: Survival Analysis
  8. Wah W, Wai KL, Pek PP, Ho AFW, Alsakaf O, Chia MYC, et al.
    Am J Emerg Med, 2017 Feb;35(2):206-213.
    PMID: 27810251 DOI: 10.1016/j.ajem.2016.10.042
    BACKGROUND: In out of hospital cardiac arrest (OHCA), the prognostic influence of conversion to shockable rhythms during resuscitation for initially non-shockable rhythms remains unknown. This study aimed to assess the relationship between initial and subsequent shockable rhythm and post-arrest survival and neurological outcomes after OHCA.

    METHODOLOGY: This was a retrospective analysis of all OHCA cases collected from the Pan-Asian Resuscitation Outcomes Study (PAROS) registry in 7 countries in Asia between 2009 and 2012. We included OHCA cases of presumed cardiac etiology, aged 18-years and above and resuscitation attempted by EMS. We performed multivariate logistic regression analyses to assess the relationship between initial and subsequent shockable rhythm and survival and neurological outcomes. 2-stage seemingly unrelated bivariate probit models were developed to jointly model the survival and neurological outcomes. We adjusted for the clustering effects of country variance in all models.

    RESULTS: 40,160 OHCA cases met the inclusion criteria. There were 5356 OHCA cases (13.3%) with an initial shockable rhythm and 33,974 (84.7%) with an initial non-shockable rhythm. After adjustment for baseline and prehospital characteristics, OHCA with an initial shockable rhythm (odds ratio [OR] = 6.10, 95% confidence interval [CI] = 5.06-7.34) and subsequent conversion to a shockable rhythm (OR = 2.00, 95% CI = 1.10-3.65) independently predicted better survival-to-hospital-discharge outcomes. Subsequent shockable rhythm conversion significantly improved survival to admission, survival to discharge, and post-arrest overall and cerebral performance outcomes in the multivariate logistic regression and 2-stage analyses.

    CONCLUSION: Initial shockable rhythm was the strongest predictor of survival. However, conversion to a subsequent shockable rhythm significantly improved post-arrest survival and neurological outcomes. This study suggests the importance of early resuscitation efforts even for initially non-shockable rhythms, which has implications for prognosis and for the selection of subsequent post-resuscitation therapy.

    Matched MeSH terms: Survival Analysis
  9. Prasad U, Wahid MI, Jalaludin MA, Abdullah BJ, Paramsothy M, Abdul-Kareem S
    Int J Radiat Oncol Biol Phys, 2002 Jul 1;53(3):648-55.
    PMID: 12062608
    To assess the long-term survival of patients with nasopharyngeal carcinoma (NPC) who were treated with conventional radical radiotherapy (RT) followed by adjuvant chemotherapy.
    Matched MeSH terms: Survival Analysis
  10. Wong KW, Lojikip S, Chan FS, Goh KW, Pang HC
    Med J Malaysia, 2017 06;72(3):179-185.
    PMID: 28733566 MyJurnal
    AIM: To study the epidemiology, clinical characteristics, vascular access, and the short term survival of ESRD patients initiated on dialysis from Hospital Queen Elizabeth (HQE).

    BACKGROUND: The number of patients with ESRD is increasing in Sabah, Malaysia. Most patients present late and some live in remote areas with difficult access to healthcare services. Many therefore present with potentially fatal complications.

    METHODS: All the newly confirmed ESRD patients who were initiated on renal replacement therapy (RRT) from 1 January to 31 December 2014 were included. The basic epidemiological and clinical data were collected. They were divided into three groups: Group 1 - those known to the medical service and had been prepared properly for the initiation of RRT; Group 2 - those known to the medical service, but were not prepared for the RRT; Group 3 - those with undiagnosed CKD. Outcome is mainly survival at 3rd, 6th, 9th and 12th month.

    RESULTS: There were 249 ESRD patients; 153 (61.4%) were male. The average age was 53.3 years (range 12 - 83). The main cause of ESRD was diabetic nephropathy (128 patients, 51.4%). Most patients were started on RRT with a catheter (74.3%), 47 patients (18.9%) with a fistula, and 17 patients (6.8%) with a Tenckhoff catheter. 185 patients (74.3%) were not prepared properly (Group 2 - 66.3%, and Group 3 - 8.0%). Survival for the 249 patients was 86.3% at 6 months and 77.9% at 12 months. Group 2 had the worst survival (81.9% at 6 months, 71.1% at 12 months).

    CONCLUSIONS: Our data showed that most patients (74.3%) were started on dialysis in an unplanned manner with poor survival. A comprehensive and well-supported predialysis programme is needed.
    Matched MeSH terms: Survival Analysis
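Survival figures like those in the entry above (86.3% at 6 months, 77.9% at 12 months) are typically Kaplan-Meier (product-limit) estimates, which step down at each event time while censored patients simply leave the risk set. A minimal sketch in pure Python (the function and the example data are ours, invented for illustration; they are not the study's data):

```python
from collections import defaultdict

def kaplan_meier(observations):
    """Product-limit estimate of the survival function S(t).

    observations: list of (time, event) pairs. event=True means the
    endpoint (e.g. death) occurred at that time; event=False means the
    patient was censored (left follow-up while still event-free).
    Returns a list of (event_time, S(t)) points.
    """
    events = defaultdict(int)
    removed = defaultdict(int)  # events + censorings leaving the risk set
    for t, is_event in observations:
        removed[t] += 1
        if is_event:
            events[t] += 1
    at_risk = len(observations)
    surv = 1.0
    curve = []
    for t in sorted(removed):
        d = events[t]
        if d:  # the curve only steps down at event times
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= removed[t]  # censored patients leave without a step
    return curve

# Toy follow-up data: months, and whether the event occurred
data = [(1, True), (2, True), (2, False), (3, True), (4, False), (5, True)]
for t, s in kaplan_meier(data):
    print(f"S({t}) = {s:.4f}")
```

Censoring is why the denominators shrink over time: a patient lost to follow-up at month 4 contributes to the risk set before month 4 but neither counts as an event nor remains at risk afterwards.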
  11. Sutiman N, Nwe MS, Ni Lai EE, Lee DK, Chan MY, Eng-Juh Yeoh A, et al.
    Clin Lymphoma Myeloma Leuk, 2021 03;21(3):e290-e300.
    PMID: 33384264 DOI: 10.1016/j.clml.2020.11.016
    PURPOSE: To determine the prognostic factors in pediatric patients with acute myeloid leukemia (AML) and to assess whether their outcomes have improved over time.

    PATIENTS AND METHODS: Sixty-two patients with AML excluding acute promyelocytic leukemia were retrospectively analyzed. Patients in the earlier cohort (n = 36) were treated on the Medical Research Council (MRC) AML12 protocol, whereas those in the recent cohort (n = 26) were treated on the Malaysia-Singapore AML protocol (MASPORE 2006), which differed in terms of risk group stratification, cumulative anthracycline dose, and timing of hematopoietic stem-cell transplantation for high-risk patients.

    RESULTS: Significant improvements in 10-year overall survival and event-free survival were observed in patients treated with the recent MASPORE 2006 protocol compared to the earlier MRC AML12 protocol (overall survival: 88.0% ± 6.5% vs 50.1% ± 8.6%, P = .002; event-free survival: 72.1% ± 9.0% vs 50.1% ± 8.6%, P = .045). In univariate analysis, patients in the recent cohort had a significantly lower intensive care unit admission rate (11.5% vs 47.2%, P = .005) and a numerically lower relapse rate (26.9% vs 50.0%, P = .068) compared to the earlier cohort. Multivariate analysis showed that treatment protocol was the only independent predictive factor for overall survival (hazard ratio = 0.21; 95% confidence interval, 0.06-0.73, P = .014).

    CONCLUSION: Outcomes of pediatric AML patients have improved over time. The more recent MASPORE 2006 protocol led to significant improvement in long-term survival rates and reduction in intensive care unit admission rate.

    Matched MeSH terms: Survival Analysis
  12. Ohno T, Thinh DH, Kato S, Devi CR, Tung NT, Thephamongkhol K, et al.
    J Radiat Res, 2013 May;54(3):467-73.
    PMID: 23192700 DOI: 10.1093/jrr/rrs115
    The purpose of this study was to evaluate the efficacy and toxicity of radiotherapy given concurrently with weekly cisplatin, followed by adjuvant chemotherapy, for the treatment of N2-3 nasopharyngeal cancer (NPC) in Asian countries, especially regions of South and Southeast Asia where NPC is endemic. Between 2005 and 2009, 121 patients with NPC (T1-4 N2-3 M0) were registered from Vietnam, Malaysia, Indonesia, Thailand, The Philippines, China and Bangladesh. Patients were treated with 2D radiotherapy (RT) concurrently with weekly cisplatin (30 mg/m(2)), followed by adjuvant chemotherapy consisting of cisplatin (80 mg/m(2) on Day 1) and fluorouracil (800 mg/m(2) on Days 1-5) for 3 cycles. Of the 121 patients, 56 (46%) required interruption of RT. The reasons for interruption were acute non-hematological toxicities such as mucositis, pain and dermatitis in 35 patients, hematological toxicities in 11 patients, machine breakdown in 3 patients, poor general condition in 2 patients, and other causes in 8 patients. Of the patients, 93% completed at least 4 cycles of weekly cisplatin during radiotherapy, and 82% completed at least 2 cycles of adjuvant chemotherapy. With a median follow-up time of 46 months for the 77 surviving patients, the 3-year locoregional control, distant metastasis-free survival and overall survival rates were 89%, 74% and 66%, respectively. No treatment-related deaths occurred. Grade 3-4 mucositis, nausea/vomiting and leukopenia were observed in 34%, 4% and 4% of the patients, respectively. In conclusion, although our regimen showed acceptable toxicities, further improvement in survival and locoregional control is necessary.
    Matched MeSH terms: Survival Analysis
  13. Dehghan M, Mente A, Zhang X, Swaminathan S, Li W, Mohan V, et al.
    Lancet, 2017 Nov 04;390(10107):2050-2062.
    PMID: 28864332 DOI: 10.1016/S0140-6736(17)32252-3
    BACKGROUND: The relationship between macronutrients and cardiovascular disease and mortality is controversial. Most available data are from European and North American populations where nutrition excess is more likely, so their applicability to other populations is unclear.

    METHODS: The Prospective Urban Rural Epidemiology (PURE) study is a large, epidemiological cohort study of individuals aged 35-70 years (enrolled between Jan 1, 2003, and March 31, 2013) in 18 countries with a median follow-up of 7·4 years (IQR 5·3-9·3). Dietary intake of 135 335 individuals was recorded using validated food frequency questionnaires. The primary outcomes were total mortality and major cardiovascular events (fatal cardiovascular disease, non-fatal myocardial infarction, stroke, and heart failure). Secondary outcomes were all myocardial infarctions, stroke, cardiovascular disease mortality, and non-cardiovascular disease mortality. Participants were categorised into quintiles of nutrient intake (carbohydrate, fats, and protein) based on percentage of energy provided by nutrients. We assessed the associations between consumption of carbohydrate, total fat, and each type of fat with cardiovascular disease and total mortality. We calculated hazard ratios (HRs) using a multivariable Cox frailty model with random intercepts to account for centre clustering.

    FINDINGS: During follow-up, we documented 5796 deaths and 4784 major cardiovascular disease events. Higher carbohydrate intake was associated with an increased risk of total mortality (highest [quintile 5] vs lowest quintile [quintile 1] category, HR 1·28 [95% CI 1·12-1·46], ptrend=0·0001) but not with the risk of cardiovascular disease or cardiovascular disease mortality. Intake of total fat and each type of fat was associated with lower risk of total mortality (quintile 5 vs quintile 1, total fat: HR 0·77 [95% CI 0·67-0·87], ptrend<0·0001; saturated fat, HR 0·86 [0·76-0·99], ptrend=0·0088; monounsaturated fat: HR 0·81 [0·71-0·92], ptrend<0·0001; and polyunsaturated fat: HR 0·80 [0·71-0·89], ptrend<0·0001). Higher saturated fat intake was associated with lower risk of stroke (quintile 5 vs quintile 1, HR 0·79 [95% CI 0·64-0·98], ptrend=0·0498). Total fat and saturated and unsaturated fats were not significantly associated with risk of myocardial infarction or cardiovascular disease mortality.

    INTERPRETATION: High carbohydrate intake was associated with higher risk of total mortality, whereas total fat and individual types of fat were related to lower total mortality. Total fat and types of fat were not associated with cardiovascular disease, myocardial infarction, or cardiovascular disease mortality, whereas saturated fat had an inverse association with stroke. Global dietary guidelines should be reconsidered in light of these findings.

    FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).

    Matched MeSH terms: Survival Analysis
  14. Miller V, Mente A, Dehghan M, Rangarajan S, Zhang X, Swaminathan S, et al.
    Lancet, 2017 Nov 04;390(10107):2037-2049.
    PMID: 28864331 DOI: 10.1016/S0140-6736(17)32253-5
    BACKGROUND: The association between intake of fruits, vegetables, and legumes with cardiovascular disease and deaths has been investigated extensively in Europe, the USA, Japan, and China, but little or no data are available from the Middle East, South America, Africa, or south Asia.

    METHODS: We did a prospective cohort study (Prospective Urban Rural Epidemiology [PURE]) in 135 335 individuals aged 35 to 70 years without cardiovascular disease from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. We documented their diet using country-specific food frequency questionnaires at baseline. Standardised questionnaires were used to collect information about demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease. The follow-up period varied based on the date when recruitment began at each site or country. The main clinical outcomes were major cardiovascular disease (defined as death from cardiovascular causes and non-fatal myocardial infarction, stroke, and heart failure), fatal and non-fatal myocardial infarction, fatal and non-fatal strokes, cardiovascular mortality, non-cardiovascular mortality, and total mortality. Cox frailty models with random effects were used to assess associations between fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.

    FINDINGS: Participants were enrolled into the study between Jan 1, 2003, and March 31, 2013. For the current analysis, we included all unrefuted outcome events in the PURE study database through March 31, 2017. Overall, combined mean fruit, vegetable and legume intake was 3·91 (SD 2·77) servings per day. During a median 7·4 years (5·5-9·3) of follow-up, 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths were documented. Higher total fruit, vegetable, and legume intake was inversely associated with major cardiovascular disease, myocardial infarction, cardiovascular mortality, non-cardiovascular mortality, and total mortality in the models adjusted for age, sex, and centre (random effect). The estimates were substantially attenuated in the multivariable adjusted models for major cardiovascular disease (hazard ratio [HR] 0·90, 95% CI 0·74-1·10, ptrend=0·1301), myocardial infarction (0·99, 0·74-1·31; ptrend=0·2033), stroke (0·92, 0·67-1·25; ptrend=0·7092), cardiovascular mortality (0·73, 0·53-1·02; ptrend=0·0568), non-cardiovascular mortality (0·84, 0·68-1·04; ptrend =0·0038), and total mortality (0·81, 0·68-0·96; ptrend<0·0001). The HR for total mortality was lowest for three to four servings per day (0·78, 95% CI 0·69-0·88) compared with the reference group, with no further apparent decrease in HR with higher consumption. When examined separately, fruit intake was associated with lower risk of cardiovascular, non-cardiovascular, and total mortality, while legume intake was inversely associated with non-cardiovascular death and total mortality (in fully adjusted models). For vegetables, raw vegetable intake was strongly associated with a lower risk of total mortality, whereas cooked vegetable intake showed a modest benefit against mortality.

    INTERPRETATION: Higher fruit, vegetable, and legume consumption was associated with a lower risk of non-cardiovascular, and total mortality. Benefits appear to be maximum for both non-cardiovascular mortality and total mortality at three to four servings per day (equivalent to 375-500 g/day).

    FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).

    Matched MeSH terms: Survival Analysis
  15. Kim TH, Lee K, Shin SD, Ro YS, Tanaka H, Yap S, et al.
    J Emerg Med, 2017 Nov;53(5):688-696.e1.
    PMID: 29128033 DOI: 10.1016/j.jemermed.2017.08.076
    BACKGROUND: Response time interval (RTI) and scene time interval (STI) are key time variables in the out-of-hospital cardiac arrest (OHCA) cases treated and transported via emergency medical services (EMS).

    OBJECTIVE: We evaluated distribution and interactive association of RTI and STI with survival outcomes of OHCA in four Asian metropolitan cities.

    METHODS: An OHCA cohort from Pan-Asian Resuscitation Outcome Study (PAROS) conducted between January 2009 and December 2011 was analyzed. Adult EMS-treated cardiac arrests with presumed cardiac origin were included. A multivariable logistic regression model with an interaction term was used to evaluate the effect of STI according to different RTI categories on survival outcomes. Risk-adjusted predicted rates of survival outcomes were calculated and compared with observed rate.

    RESULTS: A total of 16,974 OHCA cases were analyzed after serial exclusion. Median RTI was 6.0 min (interquartile range [IQR] 5.0-8.0 min) and median STI was 12.0 min (IQR 8.0-16.1 min). A prolonged STI in the longest RTI group was associated with a lower rate of survival to discharge or survival at 30 days after arrest (adjusted odds ratio [aOR] 0.59; 95% confidence interval [CI] 0.42-0.81), as well as a poorer neurologic outcome (aOR 0.63; 95% CI 0.41-0.97), without an increased chance of prehospital return of spontaneous circulation (aOR 1.12; 95% CI 0.88-1.45).

    CONCLUSIONS: Prolonged STI in OHCA with a delayed response time had a negative association with survival outcomes in four Asian metropolitan cities using the scoop-and-run EMS model. Establishing an optimal STI based on the response time could be considered.

    Matched MeSH terms: Survival Analysis
  16. Kamarajah SK, Chan WK, Nik Mustapha NR, Mahadeva S
    Hepatol Int, 2018 Jan;12(1):44-55.
    PMID: 29372507 DOI: 10.1007/s12072-018-9843-4
    INTRODUCTION: The value of repeated liver stiffness measurement (LSM) in non-alcoholic fatty liver disease (NAFLD) has not been shown before.

    METHODS: A longitudinal study of biopsy-proven NAFLD patients was conducted at an Asian tertiary hospital from November 2012 to January 2017. Patients with paired liver biopsies and LSM were followed prospectively for liver-related and non-liver-related complications, and survival.

    RESULTS: The data for 113 biopsy-proven NAFLD patients (mean age 51.3 ± 10.6 years, male 50%) were analyzed. At baseline, advanced fibrosis based on histology and LSM was observed in 22% and 46% of patients, respectively. Paired liver biopsy and LSM at a 1-year interval were available in 71% and 80% of patients, respectively. High-risk cases (defined as patients with advanced fibrosis at baseline who had no fibrosis improvement, and patients who developed advanced fibrosis on repeat assessment) were seen in 23% and 53% of patients, based on paired liver biopsy and LSM, respectively. Type 2 diabetes mellitus was independently associated with high-risk cases. The median follow-up was 37 months, with a total follow-up of 328 person-years. High-risk cases based on paired liver biopsy had significantly higher rates of liver-related complications (p = 0.002) but no difference in other outcomes. High-risk patients based on paired LSM had significantly higher rates of liver-related complications (p = 0.046), cardiovascular events (p = 0.025) and composite outcomes (p = 0.006).

    CONCLUSION: Repeat LSM can predict liver-related complications, similar to paired liver biopsy, and may be useful in identifying patients who may be at an increased risk of cardiovascular events. Further studies in a larger cohort and with a longer follow-up should be carried out to confirm these observations.

    Matched MeSH terms: Survival Analysis
  17. Boettiger DC, Nguyen VK, Durier N, Bui HV, Heng Sim BL, Azwa I, et al.
    J Acquir Immune Defic Syndr, 2015 Feb 01;68(2):186-95.
    PMID: 25590271 DOI: 10.1097/QAI.0000000000000411
    BACKGROUND: Roughly 4% of the 1.25 million patients on antiretroviral therapy (ART) in Asia are using second-line therapy. To maximize patient benefit and regional resources, it is important to optimize the timing of second-line ART initiation and use the most effective compounds available.

    METHODS: HIV-positive patients enrolled in the TREAT Asia HIV Observational Database who had used second-line ART for ≥6 months were included. ART use and rates and predictors of second-line treatment failure were evaluated.

    RESULTS: There were 302 eligible patients. Most were male (76.5%) and exposed to HIV via heterosexual contact (71.5%). Median age at second-line initiation was 39.2 years, median CD4 cell count was 146 cells per cubic millimeter, and median HIV viral load was 16,224 copies per milliliter. Patients started second-line ART before 2007 (n = 105), in 2007-2010 (n = 147), and after 2010 (n = 50). Ritonavir-boosted lopinavir and atazanavir accounted for the majority of protease inhibitor use after 2006. Median follow-up time on second-line therapy was 2.3 years. The rates of treatment failure and mortality per 100 patient-years were 8.8 (95% confidence interval: 7.1 to 10.9) and 1.1 (95% confidence interval: 0.6 to 1.9), respectively. Older age, high baseline viral load, and use of a protease inhibitor other than lopinavir or atazanavir were associated with a significantly shorter time to second-line failure.

    CONCLUSIONS: Increased access to viral load monitoring to facilitate early detection of first-line ART failure and subsequent treatment switch is important for maximizing the durability of second-line therapy in Asia. Although second-line ART is highly effective in the region, the reported rate of failure emphasizes the need for third-line ART in a small portion of patients.

    Matched MeSH terms: Survival Analysis
  18. Saifur RG, Dieng H, Hassan AA, Salmah MR, Satho T, Miake F, et al.
    PLoS One, 2012;7(2):e30919.
    PMID: 22363516 DOI: 10.1371/journal.pone.0030919
    BACKGROUND: The domestic dengue vector Aedes aegypti mosquitoes breed in indoor containers. However, in northern peninsular Malaysia, they show equal preference for breeding in both indoor and outdoor habitats. To evaluate the epidemiological implications of this peridomestic adaptation, we examined whether Ae. aegypti exhibits decreased survival, gonotrophic activity, and fecundity due to lack of host availability and the changing breeding behavior.

    METHODOLOGY/PRINCIPAL FINDINGS: This yearlong field surveillance identified Ae. aegypti breeding in outdoor containers on an enormous scale. Through a series of experiments incorporating outdoor- and indoor-adapting as well as adapted populations, we observed that indoor habitats provided a better environment for the survival of Ae. aegypti, and the observed mortality patterns could be explained by differences in body size. The gonotrophic period was much shorter in large-bodied females. Fecundity tended to be greater in indoor-acclimated females. We also found an increased tendency toward multiple feeding in outdoor-adapted females, which were smaller than their indoor-breeding counterparts.

    CONCLUSION/SIGNIFICANCE: The data presented here suggest that acclimatization of Ae. aegypti to the outdoor environment may not decrease its lifespan or gonotrophic activity, but rather increases breeding opportunities (a greater number of discarded containers outdoors) and the rate of larval development, at the cost of smaller body size at emergence. Size is likely to be correlated with disease transmission: in general, small size in Aedes females favors increased blood-feeding frequency, resulting in larger population sizes and higher disease occurrence.

    Matched MeSH terms: Survival Analysis
  19. Pan JW, Zabidi MMA, Ng PS, Meng MY, Hasan SN, Sandey B, et al.
    Nat Commun, 2020 Dec 22;11(1):6433.
    PMID: 33353943 DOI: 10.1038/s41467-020-20173-5
    Molecular profiling of breast cancer has enabled the development of more robust molecular prognostic signatures and therapeutic options for breast cancer patients. However, non-Caucasian populations remain understudied. Here, we present the mutational, transcriptional, and copy number profiles of 560 Malaysian breast tumours and a comparative analysis of breast cancers arising in Asian and Caucasian women. Compared to breast tumours in Caucasian women, we show an increased prevalence of HER2-enriched molecular subtypes and higher prevalence of TP53 somatic mutations in ER+ Asian breast tumours. We also observe elevated immune scores in Asian breast tumours, suggesting potential clinical response to immune checkpoint inhibitors. Whilst HER2-subtype and enriched immune score are associated with improved survival, presence of TP53 somatic mutations is associated with poorer survival in ER+ tumours. Taken together, these population differences unveil opportunities to improve the understanding of this disease and lay the foundation for precision medicine in different populations.
    Matched MeSH terms: Survival Analysis
  20. Sharma V, Kaushik S, Kumar R, Yadav JP, Kaushik S
    Rev Med Virol, 2019 Jan;29(1):e2010.
    PMID: 30251294 DOI: 10.1002/rmv.2010
    Since its emergence in Malaysia in 1998, Nipah virus (NiV) has reappeared on several occasions, causing severe infections in human populations with a high mortality rate. NiV is placed, along with Hendra virus, in the genus Henipavirus of the family Paramyxoviridae. Fruit bats (genus Pteropus) are the natural host and reservoir of NiV. During the outbreaks in Malaysia and Singapore, the role of pigs as an intermediate host was confirmed: the infection was transmitted from bats to pigs and subsequently from pigs to humans. Severe encephalitis, often associated with neurological disorders, was reported in NiV infection. The first NiV outbreak in India occurred in the Siliguri district of West Bengal in 2001, where direct bat-to-human and human-to-human transmission was reported, in contrast to the role of pigs in the Malaysian outbreak. Regular NiV outbreaks were reported from Bangladesh from 2001 to 2015. The latest NiV outbreak was recorded in May 2018 in Kerala, India, resulting in the death of 17 individuals. Owing to the lack of vaccines and effective antivirals, Nipah encephalitis poses a great threat to public health. Routine surveillance in affected areas can be useful in detecting early signs of infection and can help contain these outbreaks.
    Matched MeSH terms: Survival Analysis