Displaying publications 1 - 20 of 753 in total

  1. Adeshina AM, Hashim R
    Interdiscip Sci, 2017 Mar;9(1):140-152.
    PMID: 26754740 DOI: 10.1007/s12539-015-0140-9
    Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment and therapy management. All recent standard healthcare procedures have benefitted immensely from contemporary information technology, which has revolutionized how diagnostic data are acquired, stored and shared for efficient and timely diagnosis of disease. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. This modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, keeping the encryption-decryption time of medical datasets within a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for a connected health network using the high-performance GPU-accelerated Advanced Encryption Standard. The framework was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. The framework not only accurately encrypts and decrypts medical image datasets but also successfully encrypts and decrypts textual data in Microsoft Word, Microsoft Excel and Portable Document Format files, the conventional formats for documenting medical records. The entire encryption and decryption procedure was achieved at a low computational cost on regular hardware and software resources, without compromising either the quality of the decrypted data or the security level of the algorithms.
    Matched MeSH terms: Software*
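    The paper's GPU-accelerated AES implementation is not described in enough detail to reproduce here, but the encrypt-decrypt round trip the framework performs on image and document files can be sketched on the CPU with the Python "cryptography" package (AES-256-GCM; the byte handling below is an assumption, not the authors' pipeline):

        # CPU sketch of the encrypt-decrypt round trip with AES-256-GCM from
        # the "cryptography" package; the paper's GPU acceleration and file
        # handling are not reproduced.
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)    # 256-bit AES key
        aesgcm = AESGCM(key)

        def encrypt_bytes(plaintext):
            nonce = os.urandom(12)                   # fresh nonce per message
            return nonce, aesgcm.encrypt(nonce, plaintext, None)

        def decrypt_bytes(nonce, ciphertext):
            return aesgcm.decrypt(nonce, ciphertext, None)

        # Images and Word/Excel/PDF reports are all encrypted as opaque bytes.
        nonce, ct = encrypt_bytes(b"sample medical report")
        assert decrypt_bytes(nonce, ct) == b"sample medical report"

    The same byte-level round trip applies to DICOM images and office documents alike, since both are treated as opaque byte strings.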
  2. Elhag AA, Mohamad R, Aziz MW, Zeshan F
    PLoS One, 2015;10(4):e0123086.
    PMID: 25928358 DOI: 10.1371/journal.pone.0123086
    Composite service design modeling is an essential process in the service-oriented software development life cycle, in which candidate services, composite services, operations and their dependencies must be identified and specified before they are designed. However, systematic service-oriented design modeling for composite services is still in its infancy, as most existing approaches model atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling service-oriented design, to increase reusability and decrease system complexity while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method suits composite service design modeling of distributed embedded real-time systems as well as enterprise software development, it is implemented in a smart home case study. The results of the case study not only confirm the applicability of ComSDM but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.
    Matched MeSH terms: Software*
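    As a rough illustration of the graph-based representation ComSDM relies on (the paper's actual model and metrics are not reproduced; the service names below are hypothetical), services can be held as nodes of a directed dependency graph and simple coupling counts derived from its edges:

        # Directed dependency graph of (hypothetical) services with crude
        # fan-in/fan-out coupling counts; ComSDM's actual graph model and
        # quality metrics are not reproduced.
        from collections import defaultdict

        edges = [("SmartHomeComposite", "LightingService"),
                 ("SmartHomeComposite", "ClimateService"),
                 ("ClimateService", "SensorService")]

        fan_out, fan_in = defaultdict(int), defaultdict(int)
        for src, dst in edges:
            fan_out[src] += 1      # efferent coupling of src
            fan_in[dst] += 1       # afferent coupling of dst

        for s in sorted(set(fan_out) | set(fan_in)):
            print(s, "fan-out:", fan_out[s], "fan-in:", fan_in[s])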
  3. Doshi HK
    Asia Pac Fam Med, 2003;2(4):193-195.
    Developing a software program to manage data in a general practice setting is complicated. The Vision Integrated Medical System is an example of an integrated management system, developed by general practitioners within a general practice, that offers a user-friendly system with multitasking capabilities. The present report highlights the reasons behind the development of this system and how it can assist day-to-day practice.
    Matched MeSH terms: Software
  4. Rahim LA, Kudiri KM, Bahattacharjee S
    PLoS One, 2019;14(5):e0214044.
    PMID: 31120878 DOI: 10.1371/journal.pone.0214044
    The parallelisation of big data is emerging as an important framework for large-scale parallel data applications such as seismic data processing. Seismic datasets are so large and complex that traditional data-processing software is incapable of dealing with them, and implementing parallel processing in seismic applications to improve processing speed is itself complex. To overcome this issue, a simple technique is needed that brings parallel processing to big data applications such as seismic algorithms. In our framework, we used Apache Hadoop with its MapReduce function. All experiments were conducted on the RedHat CentOS platform. Finally, we studied the bottlenecks and improved the overall performance of the system for seismic algorithms (stochastic inversion).
    Matched MeSH terms: Software
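    A minimal sketch of the Hadoop Streaming pattern such a framework builds on, assuming a simplified "trace_id,amplitude" line format rather than a real seismic record layout (the stochastic-inversion algorithm itself is not reproduced):

        # Sketch of a Hadoop Streaming job in Python: mean amplitude per
        # seismic trace. The "trace_id,amplitude" input format is an assumption.
        import sys

        def mapper(stdin):
            # Emit tab-separated key-value pairs, one per input line.
            for line in stdin:
                trace_id, amplitude = line.strip().split(",")
                print(f"{trace_id}\t{amplitude}")

        def reducer(stdin):
            # Hadoop sorts by key, so all values for a trace arrive together.
            current, total, count = None, 0.0, 0
            for line in stdin:
                key, value = line.strip().split("\t")
                if key != current:
                    if current is not None:
                        print(f"{current}\t{total / count:.4f}")
                    current, total, count = key, 0.0, 0
                total += float(value)
                count += 1
            if current is not None:
                print(f"{current}\t{total / count:.4f}")

    In practice the two functions live in separate scripts passed to hadoop-streaming.jar via its -mapper and -reducer options.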
  5. Dabbagh M, Lee SP
    ScientificWorldJournal, 2014;2014:737626.
    PMID: 24982987 DOI: 10.1155/2014/737626
    Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, none considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach that integrates the prioritization of functional and nonfunctional requirements, producing two separate prioritized lists. The effectiveness of the proposed approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our approach outperforms AHP and HAM in actual time consumption while keeping the quality of its results at a high level of agreement with those produced by the other two approaches.
    Matched MeSH terms: Software*
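    For reference, the AHP baseline used in the comparison derives priority weights from a pairwise comparison matrix via its principal eigenvector; a minimal numpy sketch with a hypothetical 3x3 judgement matrix:

        # AHP priority weights from a pairwise comparison matrix via its
        # principal eigenvector. The 3x3 judgement matrix is hypothetical.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        eigvals, eigvecs = np.linalg.eig(A)
        v = eigvecs[:, np.argmax(eigvals.real)].real
        weights = v / v.sum()                  # normalised priorities
        print(weights)                         # approx. [0.65, 0.23, 0.12]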
  6. Lim KS, Buyamin S, Ahmad A, Shapiai MI, Naim F, Mubin M, et al.
    ScientificWorldJournal, 2014;2014:364179.
    PMID: 24883386 DOI: 10.1155/2014/364179
    The vector evaluated particle swarm optimisation (VEPSO) algorithm was previously improved by incorporating nondominated solutions for solving multiobjective optimisation problems. However, the solutions obtained neither converged close to the Pareto front nor distributed evenly over it. Therefore, in this study, the concept of multiple nondominated leaders is incorporated to further improve the VEPSO algorithm: multiple nondominated solutions, each best for a respective objective function, guide the particles in finding optimal solutions. The improved VEPSO is measured by the number of nondominated solutions found, generational distance, spread, and hypervolume. The results of the conducted experiments show that the proposed VEPSO significantly improves on the existing VEPSO algorithms.
    Matched MeSH terms: Software*
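    A minimal sketch of the classic VEPSO scheme the paper starts from, on the standard two-objective Schaffer problem: each swarm optimises one objective and is socially guided by the other swarm's best. The multiple-nondominated-leader improvement proposed in the paper is not reproduced.

        # Classic two-swarm VEPSO on the Schaffer problem: f1(x) = x^2 and
        # f2(x) = (x - 2)^2. Each swarm optimises one objective; the other
        # swarm's best position serves as the social guide.
        import numpy as np

        f = [lambda x: x**2, lambda x: (x - 2)**2]
        rng = np.random.default_rng(0)
        n, iters, w, c1, c2 = 20, 100, 0.5, 1.5, 1.5
        pos = [rng.uniform(-5, 5, n) for _ in range(2)]
        vel = [np.zeros(n) for _ in range(2)]
        pbest = [p.copy() for p in pos]
        gbest = [p[np.argmin(f[s](p))] for s, p in enumerate(pos)]

        for _ in range(iters):
            for s in range(2):
                guide = gbest[1 - s]                  # other swarm's leader
                r1, r2 = rng.random(n), rng.random(n)
                vel[s] = (w * vel[s] + c1 * r1 * (pbest[s] - pos[s])
                          + c2 * r2 * (guide - pos[s]))
                pos[s] = np.clip(pos[s] + vel[s], -5, 5)
                better = f[s](pos[s]) < f[s](pbest[s])
                pbest[s][better] = pos[s][better]
                gbest[s] = pbest[s][np.argmin(f[s](pbest[s]))]

        print("f1 leader:", gbest[0], "f2 leader:", gbest[1])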
  7. Almogahed A, Mahdin H, Omar M, Zakaria NH, Gu YH, Al-Masni MA, et al.
    PLoS One, 2023;18(11):e0293742.
    PMID: 37917752 DOI: 10.1371/journal.pone.0293742
    Refactoring, a widely adopted technique, has proven effective in facilitating and reducing maintenance activities and costs. Nonetheless, reports on the effects of refactoring techniques on software quality are inconsistent and contradictory, leaving conflicting evidence on their overall benefit, so software developers struggle to leverage these techniques to improve software quality. Moreover, the absence of a categorization model hampers developers' ability to decide which refactoring techniques best suit specific design goals. This study therefore proposes a novel refactoring categorization model that groups techniques by their measurable impacts on internal quality attributes. The most common refactoring techniques used by software practitioners were first identified. An experimental study across five case studies then measured the impacts of these techniques on internal quality attributes, followed by a multi-case analysis of the effects across the case studies. The proposed model, built from these results, categorizes refactoring techniques into green, yellow, and red categories. Acting as a guideline, it helps developers understand the effect of each refactoring technique on quality attributes and select appropriate techniques to improve specific attributes. Compared with existing studies, the model offers a more granular categorization over a wide range (ten refactoring techniques and eleven internal quality attributes), explicitly highlighting areas of strength and concern for each technique. This granularity equips developers with an in-depth understanding of each technique's impact and fosters informed decision-making, simplifying the decision process and saving the time and effort otherwise spent weighing the benefits and drawbacks of the various refactoring techniques. It also has the potential to help reduce maintenance activities and their associated costs.
    Matched MeSH terms: Software*
  8. Karimi A, Zarafshan F, Al-Haddad SA, Ramli AR
    ScientificWorldJournal, 2014;2014:672832.
    PMID: 25386613 DOI: 10.1155/2014/672832
    Voting is an important operation in the multichannel computation paradigm and in the realization of ultrareliable, real-time control systems that arbitrate among the results of N redundant variants. Such systems include N-modular redundant (NMR) hardware systems and diversely designed software systems based on N-version programming (NVP). Depending on the characteristics of the application and the type of voter selected, voting algorithms can be implemented for either hardware or software systems. In this paper, a novel voting algorithm is introduced for real-time fault-tolerant control systems, appropriate for applications in which N is large. Its behavior was then implemented in software and examined under different scenarios of error injection on the system inputs. Evaluations analyzed through plots and statistical computations demonstrate that this novel algorithm does not share the limitations of some popular voting algorithms, such as the median and weighted voters; moreover, it increases the reliability and availability of the system significantly, by 2489.7% and 626.74% respectively in the best case, and by 3.84% and 1.55% respectively in the worst case.
    Matched MeSH terms: Software*; Software Design
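    For context, the two popular voters the paper compares against can be sketched in a few lines (the novel algorithm itself is not reproduced; the sample outputs simulate one faulty channel):

        # Two classic software voters over N redundant module outputs.
        import statistics

        def median_voter(outputs):
            # Median of N redundant results; robust to a minority of faults.
            return statistics.median(outputs)

        def weighted_average_voter(outputs, weights):
            # Weighted average; weights reflect per-module trust.
            return sum(o * w for o, w in zip(outputs, weights)) / sum(weights)

        print(median_voter([9.98, 10.02, 14.7]))            # faulty third channel
        print(weighted_average_voter([9.98, 10.02, 14.7], [1.0, 1.0, 0.2]))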
  9. Magableh A, Shukur Z, Ali NM
    ScientificWorldJournal, 2014;2014:327808.
    PMID: 25136656 DOI: 10.1155/2014/327808
    The Unified Modeling Language (UML) is the most popular and widely used object-oriented modelling language in the IT industry. This study investigates the ability to extend UML to model crosscutting concerns (aspects) in support of AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available aspect-oriented UML modelling approaches and find that the existing approaches cannot be considered to provide a framework for a comprehensive aspectual UML modelling approach, and that adequate aspect-oriented tool support is lacking. This study also proposes a set of aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance, a "good design" criteria-based evaluation to assess the quality of the design, and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach, which provides a comprehensive set of aspectual UML structural and behavioral diagrams, designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.
    Matched MeSH terms: Software*; Software Design
  10. Lum KY, Lindén M, Tan TS
    Stud Health Technol Inform, 2015;211:225-32.
    PMID: 25980873
    For medical applications, the efficiency and transmission distance of wireless power transfer (WPT) are always the main concerns. Research has shown that impedance matching is one of the critical factors in dealing with this problem; however, little work has taken both the source and load sides into consideration. Matching both sides is crucial to achieving optimum overall performance, and the present work proposes a circuit-model analysis for design and implementation. The proposed technique was validated against experiment and software simulation. Results showed an improvement in transmission distance of up to 6 times, and efficiency at this transmission distance improved up to 7 times compared with the impedance-mismatched system. The system demonstrated near-constant transfer efficiency over an operating range of 2 cm to 12 cm.
    Matched MeSH terms: Software
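    A worked example of why matching matters, using the standard reflection-coefficient and mismatch-loss formulas rather than the paper's full circuit model (the 50-ohm reference and load values are illustrative):

        # Reflection coefficient and mismatch loss for a resistive load ZL
        # against a reference impedance Z0 (standard formulas; values are
        # illustrative, not the paper's coil parameters).
        import math

        def mismatch_loss_db(ZL, Z0=50.0):
            gamma = abs((ZL - Z0) / (ZL + Z0))       # |reflection coefficient|
            return -10 * math.log10(1 - gamma**2)    # power lost to reflection

        print(mismatch_loss_db(50.0))    # matched load: 0 dB loss
        print(mismatch_loss_db(200.0))   # mismatched load: ~1.9 dB loss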
  11. Perumal L
    Heliyon, 2019 Aug;5(8):e02319.
    PMID: 31517093 DOI: 10.1016/j.heliyon.2019.e02319
    New techniques are presented for Delaunay triangular mesh generation and element optimisation. Sample points for triangulation are generated through mapping (a new approach). These sample points are later triangulated by the conventional Delaunay method. Resulting triangular elements are optimised by addition, removal and relocation of mapped sample points (element nodes). The proposed techniques (generation of sample points through mapping for Delaunay triangulation and mesh optimisation) are demonstrated by using Mathematica software. Simulation results show that the proposed techniques are able to form meshes that consist of triangular elements with aspect ratio of less than 2 and minimum skewness of more than 45°.
    Matched MeSH terms: Software
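    A minimal sketch of the two steps reported above, using scipy's conventional Delaunay triangulation on random (not mapped) sample points and a simple longest-over-shortest-edge aspect ratio (aspect-ratio definitions vary; this one is an assumption):

        # Delaunay triangulation of sample points (random here, rather than
        # generated through mapping as in the paper) and the aspect ratio of
        # each element, taken as longest edge over shortest edge.
        import numpy as np
        from scipy.spatial import Delaunay

        pts = np.random.default_rng(1).random((30, 2))
        tri = Delaunay(pts)

        def aspect_ratio(a, b, c):
            e = [np.linalg.norm(b - a), np.linalg.norm(c - b),
                 np.linalg.norm(a - c)]
            return max(e) / min(e)

        ratios = [aspect_ratio(*pts[s]) for s in tri.simplices]
        print("worst aspect ratio:", max(ratios))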
  12. Hassan MM, Tan IKT, Yap TTV
    Data Brief, 2019 Dec;27:104736.
    PMID: 31788509 DOI: 10.1016/j.dib.2019.104736
    The Internet Engineering Task Force provides a network-based mobility management solution, Proxy Mobile IPv6 (PMIPv6), which executes handover in heterogeneous networks on the network side. In this data article, data gathered during horizontal and vertical handover of video communication under the PMIPv6 mobility protocol are presented. The handover data were gathered under network simulation software using several measurement factors, namely latency, jitter, cumulative measurements, and peak signal-to-noise ratio, for both horizontal and vertical handovers [8].
    Matched MeSH terms: Software
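    Of the factors listed, peak signal-to-noise ratio has a standard definition that is easy to state in code; the frames below are synthetic stand-ins for the video data:

        # Peak signal-to-noise ratio between a reference frame and a degraded
        # frame (standard definition; the frames here are synthetic).
        import numpy as np

        def psnr(reference, degraded, peak=255.0):
            mse = np.mean((reference.astype(float) - degraded.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

        rng = np.random.default_rng(0)
        frame = rng.integers(0, 256, (480, 640), dtype=np.uint8)
        noisy = np.clip(frame + rng.normal(0, 5, frame.shape), 0, 255).astype(np.uint8)
        print(f"PSNR: {psnr(frame, noisy):.1f} dB")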
  13. Supian Samat
    A description is given of the numerical integration method for calculating the mean kidney dose from a Co-57 external radiation source. Based on this theory, a computer program was written. An initial calculation of the kidney volume shows that the method has good accuracy. For the mean kidney dose, the method gives a satisfactory result, since the calculated value lies within the acceptable range of the central-axis depth dose.
    Matched MeSH terms: Software
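    The numerical-integration idea can be sketched as integrating a depth-dose curve over the organ's depth span and dividing by that span; the exponential profile and depth range below are illustrative assumptions, not the paper's Co-57 data:

        # Mean dose as the integral of a depth-dose function over the organ's
        # depth span, divided by that span. The attenuation curve and 4-8 cm
        # depth range are illustrative assumptions.
        import numpy as np
        from scipy.integrate import simpson

        depth = np.linspace(4.0, 8.0, 101)      # assumed organ depth span, cm
        dose = np.exp(-0.12 * depth)            # illustrative depth-dose curve
        mean_dose = simpson(dose, x=depth) / (depth[-1] - depth[0])
        print(f"mean dose (relative units): {mean_dose:.4f}")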
  14. Elayaraja Aruchunan, Mohana Sundaram Muthuvalu, Jumat Sulaiman
    Sains Malaysiana, 2015;44:139-146.
    In this paper, we examine the effectiveness of the quarter-sweep iteration concept in the conjugate gradient normal residual (CGNR) iterative method, using composite Simpson's (CS) and finite difference (FD) discretization schemes to solve Fredholm integro-differential equations. For comparison, the Gauss-Seidel (GS) method and the standard (full-sweep) and half-sweep CGNR methods, namely FSCGNR and HSCGNR, are also presented. To validate the efficacy of the proposed method, analyses such as computational complexity and percentage reduction were carried out on the proposed and existing methods.
    Matched MeSH terms: Software
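    For reference, CGNR applies conjugate gradients to the normal equations A^T A x = A^T b; a minimal numpy sketch on a small generic system (the quarter-sweep discretisation of the Fredholm problem is not reproduced):

        # CGNR: conjugate gradients applied to the normal equations
        # A^T A x = A^T b, sketched on a small generic system.
        import numpy as np

        def cgnr(A, b, tol=1e-10, max_iter=500):
            x = np.zeros(A.shape[1])
            r = b - A @ x
            z = A.T @ r                  # normal-equations residual
            p = z.copy()
            for _ in range(max_iter):
                w = A @ p
                alpha = (z @ z) / (w @ w)
                x += alpha * p
                r -= alpha * w
                z_new = A.T @ r
                if np.linalg.norm(z_new) < tol:
                    break
                p = z_new + ((z_new @ z_new) / (z @ z)) * p
                z = z_new
            return x

        print(cgnr(np.array([[4.0, 1.0], [1.0, 3.0]]), np.array([1.0, 2.0])))
        # -> approximately [0.0909, 0.6364]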
  15. Hannan MA, Hussain A, Samad SA
    Sensors (Basel), 2010;10(2):1141-53.
    PMID: 22205861 DOI: 10.3390/s100201141
    This paper deals with the interface-relevant activity of a vehicle-integrated intelligent safety system (ISS) that includes an airbag deployment decision system (ADDS) and a tire pressure monitoring system (TPMS). A program is developed in LabWindows/CVI, using C, for prototype implementation. The prototype is primarily concerned with the interconnection between hardware objects such as a load cell, web camera, accelerometer, TPM tire module and receiver module, DAQ card, CPU card and a touch screen. Several safety subsystems, including image processing, weight sensing and crash detection systems, are integrated, and their outputs are combined to yield intelligent decisions regarding airbag deployment. The integrated safety system also monitors tire pressure and temperature. Testing and experimentation with this ISS suggests that the system is unique, robust, intelligent, and appropriate for in-vehicle applications.
    Matched MeSH terms: Software*
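    Purely as a hypothetical illustration of the decision-fusion step (thresholds, inputs and units are assumptions, not the ISS design), combining weight sensing, crash detection and occupant detection might look like:

        # Hypothetical decision fusion only; thresholds, inputs and units are
        # assumptions, not the ISS design.
        def airbag_decision(occupant_weight_kg, crash_g, occupant_detected):
            if not occupant_detected or occupant_weight_kg < 20:
                return "suppress"    # empty seat or small child detected
            if crash_g >= 20:
                return "deploy"      # severe deceleration sensed
            return "standby"

        print(airbag_decision(72, 35, True))   # -> deploy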
  16. Cheong WH, Tan YC, Yap SJ, Ng KP
    Bioinformatics, 2015 Nov 15;31(22):3685-7.
    PMID: 26227146 DOI: 10.1093/bioinformatics/btv433
    We present ClicO Free Service, an online web service based on Circos, which provides a user-friendly, interactive web-based interface with configurable features for generating Circos circular plots.
    Matched MeSH terms: Software*
  17. Manley S
    Account Res, 2023 May;30(4):219-245.
    PMID: 34569370 DOI: 10.1080/08989621.2021.1986018
    Popular text-matching software generates a percentage of similarity - called a "similarity score" or "Similarity Index" - that quantifies the matching text between a particular manuscript and content in the software's archives, on the Internet and in electronic databases. Many evaluators rely on these simple figures as a proxy for plagiarism and thus avoid the burdensome task of inspecting the longer, detailed Similarity Reports. Yet similarity scores, though alluringly straightforward, are never enough to judge the presence (or absence) of plagiarism. Ideally, evaluators should always examine the Similarity Reports. Given the persistent use of simplistic similarity score thresholds at some academic journals and educational institutions, however, and the time that can be saved by relying on the scores, a method is arguably needed that encourages examining the Similarity Reports but still also allows evaluators to rely on the scores in some instances. This article proposes a four-band method to accomplish this. Used together, the bands oblige evaluators to acknowledge the risk of relying on the similarity scores yet still allow them to ultimately determine whether they wish to accept that risk. The bands - for most rigor, high rigor, moderate rigor and less rigor - should be tailored to an evaluator's particular needs.
    Matched MeSH terms: Software*
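    A hypothetical encoding of the four-band idea, with illustrative thresholds only, since the article argues the bands should be tailored to each evaluator's needs:

        # Illustrative thresholds only; the article stresses that band
        # boundaries should be tailored to the evaluator's particular needs.
        def similarity_band(score):
            if score >= 40:
                return "most rigor: full inspection of the Similarity Report"
            if score >= 25:
                return "high rigor: close examination of the Similarity Report"
            if score >= 10:
                return "moderate rigor: spot-check the Similarity Report"
            return "less rigor: the score alone may be acceptable"

        print(similarity_band(18))   # -> moderate rigor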
  18. Buurman J, Zhang S, Babovic V
    Risk Anal, 2009 Mar;29(3):366-79.
    PMID: 19076327 DOI: 10.1111/j.1539-6924.2008.01160.x
    Complex engineering systems are usually designed to last for many years and will face many uncertainties in the future. Hence the design and deployment of these systems should not be based on a single scenario but should incorporate flexibility, which can be built into system architectures in the form of options that can be exercised in the future when new information is available. Incorporating flexibility comes, however, at a cost. To evaluate whether this cost is worth the investment, a real options analysis can be carried out. This approach is demonstrated through a case study of a previously developed static system-of-systems for maritime domain protection in the Straits of Malacca. This article presents a framework for dynamic strategic planning of engineering systems using real options analysis and demonstrates that flexibility adds considerable value over a static design. In addition, it shows that Monte Carlo analysis and genetic algorithms can be successfully combined to find solutions in a case with a very large number of possible futures and system designs.
    Matched MeSH terms: Software Design*
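    A minimal sketch of valuing flexibility by Monte Carlo, in the spirit of the article's real options analysis: simulate many futures and average the discounted payoff of exercising an option only when it helps. All parameters are hypothetical, not the Straits of Malacca case data.

        # Value of an option to expand the system in year 5 under many
        # simulated futures; all parameters are hypothetical.
        import numpy as np

        rng = np.random.default_rng(42)
        n, years, rate = 100_000, 5, 0.05
        drivers = rng.normal(0.02, 0.15, (n, years))     # yearly log-changes
        demand = 100 * np.exp(np.cumsum(drivers, axis=1))[:, -1]

        expand_cost = 110.0
        payoff = np.maximum(demand - expand_cost, 0.0)   # exercise only if useful
        print(f"option value: {np.exp(-rate * years) * payoff.mean():.2f}")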
  19. Muthusamy H, Polat K, Yaacob S
    PLoS One, 2015;10(3):e0120344.
    PMID: 25799141 DOI: 10.1371/journal.pone.0120344
    In recent years, many research works have been published using speech-related features for speech emotion recognition; however, recent studies show that there is a strong correlation between emotional states and glottal features. In this work, Mel-frequency cepstral coefficients (MFCCs), linear predictive cepstral coefficients (LPCCs), perceptual linear predictive (PLP) features, gammatone filter outputs, timbral texture features, stationary wavelet transform based timbral texture features, and relative wavelet packet energy and entropy features were extracted from the emotional speech (ES) signals and their glottal waveforms (GW). Particle swarm optimization based clustering (PSOC) and wrapper-based particle swarm optimization (WPSO) were proposed to enhance the discerning ability of the features and to select the discriminating features, respectively. Three different emotional speech databases were used to gauge the proposed method, and an extreme learning machine (ELM) was employed to classify the different types of emotions. The experiments conducted show that the proposed method significantly improves speech emotion recognition performance compared with previous works published in the literature.
    Matched MeSH terms: Speech Recognition Software*
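    The first feature family named above, MFCCs, can be extracted with the open-source librosa package (a common route, not necessarily the authors' toolchain; "speech.wav" is a placeholder file):

        # MFCC extraction with librosa; "speech.wav" is a placeholder for an
        # utterance from one of the emotional speech databases.
        import librosa

        y, sr = librosa.load("speech.wav", sr=16000)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # 13 coefficients per frame
        print(mfcc.shape)                                    # (13, n_frames)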
  20. Darmini, Prastanti AD, Daryati S, Kartikasari Y, Sulistiyadi AH, Setiawan DA
    Med J Malaysia, 2023 Dec;78(7):865-869.
    PMID: 38159919
    INTRODUCTION: There are two data acquisition methods for computed tomography (CT) scans, namely sequence and helical, and for each there are two ways of measuring the volume of bleeding on a head CT scan, namely manual and automatic. An analysis of measurement accuracy for these two methods under both acquisitions is therefore needed. The purpose of this study was to compare and evaluate the accuracy of bleeding volume measurement for sequence and helical head CT acquisition using the manual and automatic methods.

    MATERIALS AND METHODS: This is quantitative research with a true experimental approach. The actual bleeding volume was simulated by an acrylic phantom containing iodine contrast media (5 ml, 10 ml, 15 ml, and 20 ml). The phantom was scanned with a routine CT protocol using the helical and sequence techniques. Bleeding volume from each technique was measured manually using the Broderick formula and automatically using software (ROI based). Accuracy was assessed by comparing the measured volume to the actual bleeding volume. Data were analysed using the Friedman and Wilcoxon tests.

    RESULTS: The standard deviations of the bleeding volumes measured manually and automatically, compared with the actual bleeding volume, were (0.220; 0.236; 0.351; 0.057) and (0.139; 0.270; 0.315; 0.329) for the helical technique, and (0.333; 0.376; 0.447; 0.476) and (0.139; 0.242; 0.288; 0.376) for the sequence technique. There were differences in the measurement results between the helical and sequence techniques (p < 0.05) and between the manual and automatic methods (p < 0.05).

    CONCLUSION: Judged by the standard deviation of the measured volume relative to the actual volume, bleeding volume measurement is more accurate with the automatic method in the helical technique and with the manual method in the sequence technique.

    Matched MeSH terms: Software*
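    The "Broderick formula" for manual measurement commonly refers to the ABC/2 estimate, where A and B are the largest perpendicular bleed diameters on the slice with the largest bleed and C is the vertical extent; the values below are illustrative:

        # ABC/2 haemorrhage volume estimate; the measured diameters below are
        # illustrative.
        def abc_over_2(a_cm, b_cm, c_cm):
            # Approximate bleed volume in millilitres (1 cm^3 = 1 ml).
            return (a_cm * b_cm * c_cm) / 2.0

        print(f"estimated volume: {abc_over_2(4.0, 3.0, 2.5):.1f} ml")   # 15.0 ml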