Displaying all 2 publications

  1. Tai DT, Nhu NT, Tuan PA, Suleiman A, Omer H, Alirezaei Z, et al.
    J Xray Sci Technol, 2024 Apr 09.
    PMID: 38607727 DOI: 10.3233/XST-230255
    BACKGROUND: Accurate diagnosis and subsequent treatment planning depend on clinicians' experience with large case volumes. Deep learning applied to image processing can produce tools that promise faster, high-quality diagnoses, but the accuracy and precision of deriving 3-D information from 2-D data may be limited by factors such as superposition of organs, distortion and magnification, and the detection of new pathologies. The purpose of this research is to use radiomics and deep learning to develop a tool for lung cancer diagnosis.

    METHODS: This study applies radiomics and deep learning to the diagnosis of lung cancer to help clinicians analyze images accurately and thereby plan appropriate treatment. 86 patients were recruited from Bach Mai Hospital, and 1012 patients were collected from an open-source database. First, deep learning was applied for segmentation with a U-Net and for cancer classification with a DenseNet model. Second, radiomics was applied to measure and calculate lesion diameter, surface area, and volume. Third, hardware was built by connecting an Arduino Nano to an MFRC522 module to read patient data from RFID tags. Finally, the display interface was created as a web platform in Python using Streamlit.
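The abstract does not publish code, so the radiomics step can only be sketched. The function below is a minimal, hypothetical illustration of measuring diameter, surface area, and volume from a binary 3-D lesion mask (e.g., a U-Net output); the function name, the bounding-box diameter approximation, and the face-counting surface estimate are assumptions, not the authors' method.

```python
import numpy as np

def mask_measurements(mask, spacing=(1.0, 1.0, 1.0)):
    """Estimate (diameter, surface area, volume) of a binary 3-D lesion mask.

    mask: boolean array ordered (z, y, x); spacing: voxel size in mm per axis.
    """
    mask = np.asarray(mask, dtype=bool)
    voxel_vol = spacing[0] * spacing[1] * spacing[2]
    volume = mask.sum() * voxel_vol  # mm^3

    # Surface area: count exposed voxel faces along each axis and weight
    # each face by its physical area.
    face_areas = (spacing[1] * spacing[2],   # faces normal to z
                  spacing[0] * spacing[2],   # faces normal to y
                  spacing[0] * spacing[1])   # faces normal to x
    area = 0.0
    for axis, face in enumerate(face_areas):
        padded = np.pad(mask, [(1, 1) if a == axis else (0, 0) for a in range(3)])
        # A 0->1 or 1->0 transition along `axis` is one exposed face.
        area += np.abs(np.diff(padded.astype(int), axis=axis)).sum() * face

    coords = np.argwhere(mask) * np.array(spacing)
    if len(coords) == 0:
        return 0.0, 0.0, 0.0
    # Cheap diameter proxy: diagonal of the bounding box of voxel centers.
    diameter = float(np.linalg.norm(coords.max(axis=0) - coords.min(axis=0)))
    return diameter, area, volume
```

For example, a 2x2x2 cube of voxels at unit spacing yields a volume of 8 mm^3 and a surface area of 24 mm^2; production radiomics pipelines typically use a dedicated library rather than this hand-rolled estimate.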

    RESULTS: The segmentation model yielded a validation loss of 0.498 and a training loss of 0.27; the cancer classification model yielded a validation loss of 0.78 and a training accuracy of 0.98. The diagnostic outcomes (recognition and classification of lung cancer from chest CT scans) were quite successful.

    CONCLUSIONS: The system provides a means of storing and updating patients' data directly in the interface, making results readily available to health care providers. The developed system will improve clinical communication and information exchange. Moreover, it can support care management by generating coherent, correlated summaries of cancer diagnoses.
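The storage-and-update behavior described above can be sketched as a small record store keyed by RFID tag UID. Everything here is an assumption for illustration: the paper does not publish its schema, and the class and field names (`PatientRecord`, `RecordStore`, `tag_id`) are hypothetical; in the actual system a Streamlit front end would render and edit these records.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class PatientRecord:
    tag_id: str                 # UID read from the RFID tag via the MFRC522
    name: str
    diagnosis: str = "pending"
    measurements: dict = field(default_factory=dict)  # e.g. volume, diameter

class RecordStore:
    """In-memory store keyed by tag UID; upsert keeps results current."""

    def __init__(self):
        self._records = {}

    def upsert(self, record: PatientRecord) -> None:
        # Insert a new record or overwrite the existing one for this tag.
        self._records[record.tag_id] = record

    def get(self, tag_id: str):
        return self._records.get(tag_id)

    def to_json(self) -> str:
        # Serializable view, e.g. for display in a web interface.
        return json.dumps({k: asdict(v) for k, v in self._records.items()})
```

A real deployment would back this with a database rather than process memory, but the upsert-by-tag pattern is the essential idea behind "storing and updating patients' data directly in the interface".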

  2. Beatson SA, Ben Zakour NL, Totsika M, Forde BM, Watts RE, Mabbett AN, et al.
    Infect Immun, 2015 May;83(5):1749-64.
    PMID: 25667270 DOI: 10.1128/IAI.02810-14
    Urinary tract infections (UTIs) are among the most common infectious diseases of humans, with Escherichia coli responsible for >80% of all cases. One extreme of UTI is asymptomatic bacteriuria (ABU), which occurs as an asymptomatic carrier state that resembles commensalism. To understand the evolution and molecular mechanisms that underpin ABU, the genome of the ABU E. coli strain VR50 was sequenced. Analysis of the complete genome indicated that it most resembles E. coli K-12, with the addition of a 94-kb genomic island (GI-VR50-pheV), eight prophages, and multiple plasmids. GI-VR50-pheV has a mosaic structure and contains genes encoding a number of UTI-associated virulence factors, namely, Afa (afimbrial adhesin), two autotransporter proteins (Ag43 and Sat), and aerobactin. We demonstrated that the presence of this island in VR50 confers its ability to colonize the murine bladder, as a VR50 mutant with GI-VR50-pheV deleted was attenuated in a mouse model of UTI in vivo. We established that Afa is the island-encoded factor responsible for this phenotype using two independent deletion (Afa operon and AfaE adhesin) mutants. E. coli VR50afa and VR50afaE displayed significantly decreased ability to adhere to human bladder epithelial cells. In the mouse model of UTI, VR50afa and VR50afaE displayed reduced bladder colonization compared to wild-type VR50, similar to the colonization level of the GI-VR50-pheV mutant. Our study suggests that E. coli VR50 is a commensal-like strain that has acquired fitness factors that facilitate colonization of the human bladder.