Selected landmarks from each of 47 maxillary dental casts were used to define a Cartesian coordinate system from which the positions of selected teeth were determined on standardized digital images. The position of the i-th tooth was defined by the length l(i) of the line joining the tooth to the origin and the angle θ(i) of this line to the horizontal Cartesian axis. Four teeth (the central incisor, lateral incisor, canine and first molar) were selected, and their positions were collectively used to represent the shape of the dental arch. A pilot study using clustering and principal component analysis strongly suggested the existence of 3 groups of arch shape. In this study, the homogeneity of the 3 groups was further investigated and confirmed using the Dunn and Davies-Bouldin validity indices. This was followed by an investigation of the probability distribution of these 3 groups. The main result of this study suggests that each of the 3 groups follows a multivariate (MV) normal distribution. The MV normal probability distributions of these groups may be used in further studies to investigate variation in arch shape, which is fundamental to the practice of prosthodontics and orthodontics.
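The polar description of tooth position used above can be sketched as follows; the landmark coordinates, units and tooth values here are hypothetical, purely to illustrate the (l, θ) encoding of arch shape:

```python
import math

def tooth_polar(x, y):
    """Return (l, theta): the length of the line joining a tooth at (x, y)
    to the origin, and its angle (degrees) to the horizontal Cartesian axis."""
    l = math.hypot(x, y)
    theta = math.degrees(math.atan2(y, x))
    return l, theta

# Hypothetical landmark coordinates (mm) for the four selected teeth.
teeth = {
    "central incisor": (2.1, 28.4),
    "lateral incisor": (8.0, 26.9),
    "canine": (14.2, 22.5),
    "first molar": (23.7, 6.3),
}

# The arch shape is then the 8-vector of (l, theta) over the four teeth,
# the representation fed to clustering and principal component analysis.
arch_shape = [v for xy in teeth.values() for v in tooth_polar(*xy)]
```
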
In video analytics, robust observation detection is very important because video content varies widely, especially for tracking implementations. In contrast to the image processing field, the problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are commonly encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two methods of PBOD, the deterministic and probabilistic approaches, were tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a 2-level test in which threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches with Poisson distributions for both RGB and HSV colour models. Maximum likelihood is then applied for position smoothing, while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. This algorithm is best implemented as a complement to other, simpler detection methods because of its heavy processing requirements.
A solar thermal system, or solar water heater system, is one of the applications used to produce hot water in the residential sector. This paper describes a HAZOP analysis and reliability assessment carried out to evaluate the potential hazards and failure probability of a closed-loop solar thermal system applied in a residential area. Hazard identification for the main system components is analyzed, while Fault Tree Analysis (FTA), a Reliability Block Diagram (RBD) and Weibull distributions are used to determine the reliability of the overall system. The results show that there are 49 potential hazards for the system, with a failure probability of 0.23822 and a reliability of 0.9693. This study thus identifies the potential hazards of the system, which residential consumers can anticipate for safety purposes. Furthermore, the evaluated reliability shows that the application of a closed-loop solar water heater system at residential premises is highly recommended due to its long-lasting operational condition.
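The RBD/Weibull logic described above can be sketched as follows. The component list and Weibull parameters here are illustrative assumptions, not values from the study; a series RBD is assumed, in which every component must survive for the system to function:

```python
import math

def weibull_reliability(t, beta, eta):
    """Weibull survival function R(t) = exp(-(t/eta)**beta),
    with shape beta and scale eta (characteristic life)."""
    return math.exp(-((t / eta) ** beta))

def series_reliability(rels):
    """RBD series system: system reliability is the product of
    the component reliabilities."""
    out = 1.0
    for r in rels:
        out *= r
    return out

# Hypothetical component parameters (beta, eta in hours), illustrative only.
components = {
    "collector": (1.5, 120_000),
    "pump": (1.2, 60_000),
    "tank": (1.1, 150_000),
    "controller": (1.0, 90_000),
}

t = 8760  # one year of continuous operation
rels = [weibull_reliability(t, b, e) for b, e in components.values()]
R_sys = series_reliability(rels)
failure_prob = 1.0 - R_sys
```
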
A noticeable increase in drought frequency and severity has been observed across the globe due to climate change, which has attracted scientists to develop drought prediction models for impact mitigation. Droughts are usually monitored using drought indices (DIs), most of which are probabilistic and therefore highly stochastic and non-linear. The current research investigated the capability of different versions of relatively well-explored machine learning (ML) models, including random forest (RF), minimum probability machine regression (MPMR), M5 Tree (M5tree), extreme learning machine (ELM) and online sequential ELM (OSELM), in predicting the most widely used DI, the standardized precipitation index (SPI), at multiple month horizons (i.e., 1, 3, 6 and 12). Models were developed using monthly rainfall data for the period 1949-2013 at four meteorological stations, namely Barisal, Bogra, Faridpur and Mymensingh, each representing a geographical region of Bangladesh that frequently experiences droughts. The model inputs were decided based on correlation statistics, and the prediction capability was evaluated using several statistical metrics, including mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), correlation coefficient (R), Willmott's index of agreement (WI), Nash-Sutcliffe efficiency (NSE), and the Legates and McCabe index (LM). The results revealed that the proposed models are reliable and robust in predicting droughts in the region. Comparison of the models revealed ELM as the best model for forecasting droughts, with minimal RMSE in the range of 0.07-0.85, 0.08-0.76, 0.062-0.80 and 0.042-0.605 for Barisal, Bogra, Faridpur and Mymensingh, respectively, for all SPI scales except the one-month SPI, for which RF showed the best performance with minimal RMSE of 0.57, 0.45, 0.59 and 0.42, respectively.
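Several of the evaluation metrics named above follow directly from their standard definitions and can be sketched as below; the observed/simulated SPI values are synthetic placeholders, not data from the study:

```python
def rmse(obs, sim):
    """Root mean square error."""
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better
    than predicting the observed mean."""
    m = sum(obs) / len(obs)
    return 1 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                / sum((o - m) ** 2 for o in obs))

def willmott(obs, sim):
    """Willmott's index of agreement, bounded between 0 and 1."""
    m = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - m) + abs(o - m)) ** 2 for o, s in zip(obs, sim))
    return 1 - num / den

# Hypothetical observed vs. predicted one-month SPI values.
obs = [-1.2, -0.4, 0.3, 1.1, -0.8, 0.0]
sim = [-1.0, -0.5, 0.4, 0.9, -0.9, 0.1]
```
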
The massive growth in mobile users will lead to significant numbers of small cells in the Fifth Generation (5G) mobile network, which will overlap the fourth generation (4G) network. A tremendous increase in handover (HO) scenarios and HO rates will occur. Ensuring a stable and reliable connection throughout the mobility of user equipment (UE) will become a major problem in future mobile networks. This problem will be magnified by the use of suboptimal handover control parameter (HCP) settings, which can be configured manually or automatically. Therefore, the aim of this study is to investigate the impact of different HCP settings on the performance of a 5G network. Several system scenarios are proposed and investigated based on different HCP settings and mobile speed scenarios. The different mobile speeds are expected to demonstrate the influence of the proposed system scenarios on 5G network performance. We conducted simulations utilizing MATLAB software and its related tools. Evaluation comparisons were performed in terms of handover probability (HOP), ping-pong handover probability (PPHP) and outage probability (OP). The 5G network framework was employed to evaluate the proposed system scenarios. The simulation results reveal a trade-off in the results obtained from the various systems. The use of lower HCP settings provides noticeable enhancements over higher HCP settings in terms of OP. Simultaneously, lower HCP settings show noticeable drawbacks compared with higher HCP settings in terms of high PPHP for all mobile speed scenarios. The simulation results show that medium HCP settings may be an acceptable solution if one of these systems is applied. This study emphasises the application of automatic self-optimisation (ASO) functions as the best solution that considers user experience.
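The trade-off between lower and higher HCP settings can be illustrated with a minimal sketch of an A3-style trigger (in the spirit of the 3GPP measurement event, simplified here): the hysteresis margin and time-to-trigger are the HCP values, and the RSRP traces are hypothetical. Smaller values trigger the handover earlier (lower outage risk, more ping-pong); larger values delay it:

```python
def a3_handover(serving_rsrp, target_rsrp, hys_db, ttt_steps):
    """Simplified A3-style trigger: hand over once the target cell's RSRP
    exceeds the serving cell's by hys_db for ttt_steps consecutive samples.
    Returns the sample index at which the handover fires, or None."""
    run = 0
    for i, (s, t) in enumerate(zip(serving_rsrp, target_rsrp)):
        if t > s + hys_db:
            run += 1
            if run >= ttt_steps:
                return i
        else:
            run = 0
    return None

# Hypothetical RSRP traces (dBm) as the UE moves toward the target cell.
serving = [-80, -82, -84, -86, -88, -90, -92]
target  = [-95, -90, -86, -84, -83, -82, -81]

# Lower HCP settings trigger earlier than higher ones.
early = a3_handover(serving, target, hys_db=1, ttt_steps=1)
late  = a3_handover(serving, target, hys_db=6, ttt_steps=2)
```
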
Hydraulic modeling of a foul sewer system (FSS) enables a better understanding of the behavior of the system and its effective management. However, there is generally a lack of sufficient field measurement data for FSS model development due to the low number of in-situ sensors for data collection. To this end, this study proposes a new method to develop FSS models based on geotagged information and water consumption data from smart water meters that are readily available. Within the proposed method, each sewer manhole is first associated with a particular population whose size is estimated from geotagged data. Subsequently, a two-stage optimization framework is developed to identify daily time-series inflows for each manhole based on physical connections between manholes and population as well as sewer sensor observations. Finally, a new uncertainty analysis method is developed by mapping the probability distributions of water consumption captured by smart meters to the stochastic variations of wastewater discharges. Two real-world FSSs are used to demonstrate the effectiveness of the proposed method. Results show that the proposed method can significantly outperform the traditional FSS model development approach in accurately simulating the values and uncertainty ranges of FSS hydraulic variables (manhole water depths and sewer flows). The proposed method is promising due to the easy availability of geotagged information as well as water consumption data from smart water meters in the near future.
Consumers are reported to be increasingly concerned about their health. Nonetheless, consumers show different attitudes toward food at home and away from home. In particular, consumers tend to shy away from healthy food items when dining on special occasions. This study is the first to look into the number of healthy menu items provided to consumers during dining occasions. The impacts of two independent variables (dining occasion: normal vs. special; number of healthy items: limited vs. extended) on consumers' dining menu selections were examined among female university students. The results of this study indicate that both the dining occasion and the number of healthy items offered can influence consumers' food selection independently. Although consumers are more likely to choose unhealthy items while dining on special occasions, offering more healthy items would increase the probability of healthy eating. This study also offers some insights into the food categories and cooking methods favored by consumers. Further studies should explore other potential foods that would enhance the selection of healthy options by consumers.
Count event data such as mortality cases are often modelled using a Poisson regression model. The objective of this study is to compare the goodness of fit of the Poisson regression model and the Negative Binomial regression model in an air pollution epidemiologic time series study using SAS software. Maximum likelihood procedures, carried out in SAS via PROC GENMOD, were used to estimate the model parameters. The Negative Binomial distribution has been widely suggested as the alternative to the Poisson when there is evidence of overdispersion. We modelled the mortality cases as the dependent variable using both Poisson and Negative Binomial regression and compared the two models. The results showed that the Poisson regression of the mortality data exhibits a large ratio of deviance to degrees of freedom, which indicates model misspecification or overdispersion. This ratio was found to be reduced in the Negative Binomial regression. The normal probability plot of the Pearson residuals confirmed that the Negative Binomial regression is a better model than the Poisson regression for the mortality data.
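The overdispersion diagnostic described above (a goodness-of-fit statistic per degree of freedom well above 1) can be sketched for the simplest case, an intercept-only Poisson model; the count series below are synthetic, not the study's mortality data, and this is an illustration of the idea rather than the PROC GENMOD output:

```python
from statistics import mean

def pearson_dispersion(counts):
    """Pearson chi-square statistic per degree of freedom for an
    intercept-only Poisson model (fitted mean = sample mean).
    Values well above 1 signal overdispersion (variance > mean)."""
    mu = mean(counts)
    chi2 = sum((y - mu) ** 2 / mu for y in counts)
    return chi2 / (len(counts) - 1)

# Synthetic daily counts: the second series has variance far above its
# mean, mimicking the overdispersion that motivates the Negative Binomial.
well_behaved = [3, 5, 4, 6, 5, 4, 3, 5, 6, 4]
overdispersed = [0, 12, 1, 15, 0, 14, 2, 11, 1, 13]
```
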
Work shift has been shown to correlate with accident rates. Understanding this correlation is pertinent, especially among emergency response personnel, since the decisions they make determine not only the outcome of their responses but also their own risk of accidents. A questionnaire-derived data study, used together with a semi-quantitative risk analysis method, was adopted to estimate the levels of accident risk among firefighters working two different shifts. Two hundred and forty-eight firemen of Malaysia's Fire and Rescue Department, drawn from 24 fire stations working on shifts, were selected as respondents. The accident rate among firefighters in 2006 was 52.8%. Results showed that the Accident Risk Index (ARI) among firefighters working the 24-hour shift was higher (ARI = 3.14) than among those on the 12-hour shift (ARI = 2.98). However, there was no significant difference in the overall severity of the accidents between the two shifts (p > 0.05). The difference in risk levels was attributed to the difference in the likelihood of accident occurrence.
The difficulty of obtaining cloud-free scenes of the equatorial region from satellite platforms can be overcome by using airborne imagery, introduced here as an economical method of acquiring remote sensing data that requires only a digital camera to provide near-real-time data. Forty-three digital images were captured using a high-resolution digital camera, a Pentax Optio A40 (12 megapixels), at a selected location on Penang Island on the same day from a low-altitude autopilot aircraft (CropCam) to generate a land use/land cover (LULC) map of the test area. The CropCam was flown at an average altitude of 320 meters above the ground while capturing images during two flying missions of approximately 15 and 20 minutes, respectively. The CropCam was equipped with a digital camera as a sensor to capture GPS-referenced, time-stamped digital images to enable mosaicking. Forty-one images were used to produce a mosaic image covering a larger area (full panorama). Training samples were collected with a hand-held GPS at the same time as the CropCam captured the images. Supervised classification techniques, such as maximum likelihood, minimum distance-to-mean, and parallelepiped, were applied to the panoramic image to generate the LULC map for the study area. It was found that the maximum likelihood classifier produced superior results and achieved a high degree of accuracy. The results indicated that a CropCam equipped with a high-resolution digital camera can be a useful and suitable tool for the tropical region, and this technique could reduce the cost and time of acquiring images for LULC mapping.
Microwave remote sensing data have been widely used in land cover and land use classification. The objective of this research paper is to investigate the feasibility of multi-polarized ALOS-PALSAR data for land cover mapping. This paper presents the methodology and preliminary results, including data acquisition, data processing and data analysis. Standard supervised classification techniques such as maximum likelihood, minimum distance-to-mean, and parallelepiped were applied to the ALOS-PALSAR images in the land cover mapping analysis. The PALSAR training areas were chosen based on information obtained from optical satellite imagery. The best supervised classifier was selected based on the highest overall accuracy and kappa coefficient. This study indicated that the land cover of Butterworth, Malaysia can be mapped accurately using ALOS-PALSAR data.
The interference of ²³⁵U on the ²²⁶Ra concentration measured directly using the γ-ray energy of 186 keV, and the interference of ²²⁸Ac on the ⁴⁰K analysis by gamma-spectrometry, are highlighted and discussed. The interference of ²³⁵U was demonstrated to be very significant, i.e. 45% of the ²²⁶Ra concentration measured directly at 186 keV in natural samples containing the uranium series in equilibrium. The interference of ²²⁸Ac on the ⁴⁰K concentration was particularly significant for samples containing high concentrations of ²²⁸Ac (²²⁸Ra), such as radioactive minerals. Another important aspect discussed is the assignment of the correct emission probabilities of the 583 keV and 2614 keV lines of ²⁰⁸Tl for the purpose of estimating the concentration of ²³²Th or other radionuclides in the thorium series. Extra caution is required in the interpretation of the measured ²⁰⁸Tl concentration in samples of various natures. It is suggested that the emission probability used for ²⁰⁸Tl be reported for comparison and verification.
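The 186 keV correction implied above reduces to simple arithmetic: subtracting the ²³⁵U share from the apparent activity. A minimal sketch, using the 45% figure reported for natural samples with the uranium series in secular equilibrium (the input activity value is hypothetical):

```python
def corrected_ra226(measured_186kev, u235_fraction=0.45):
    """Correct the apparent 226Ra concentration measured at 186 keV for
    the overlapping 235U line. The default 45% share assumes a natural
    sample with the uranium series in secular equilibrium, as reported."""
    return measured_186kev * (1.0 - u235_fraction)

# e.g. an apparent 100 Bq/kg at 186 keV corresponds to 55 Bq/kg of 226Ra.
ra226 = corrected_ra226(100.0)
```
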
This study aimed to validate the Malay version of the Copenhagen Psychosocial Questionnaire (COPSOQ) for Malaysian use in assessing psychosocial work environment factors. Validity and reliability were studied in 50 staff nurses of Hospital Selayang. The validity of the questionnaire was evaluated by calculating the sensitivity and specificity at different score levels. Sensitivity was plotted against specificity to produce a ROC (Receiver Operating Characteristic) curve, and a score of 52, which gave the highest combined sensitivity and specificity, was used as the overall index expressing the probability of psychosocial problems. For reliability, descriptive test-retest mean scores, paired-sample t-tests and coefficient-correlation tests were calculated. The test-retest mean scores and paired-sample t-tests for all 26 scales showed no statistically significant differences. The reliability of the questionnaire and its 26 scales was assessed using Pearson's r (overall questionnaire r within a range of 0.00 to 1.00). The COPSOQ appears to be a reliable and responsive measure for Malaysian workers and can be applied to assess psychosocial work environment factors.
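The cutoff-selection step described above (choosing the score with the highest combined sensitivity and specificity, i.e. maximizing Youden's J) can be sketched as follows; the scores and case labels are synthetic, chosen only so that the illustrative optimum happens to fall at 52:

```python
def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity when scores >= cutoff are called positive
    (labels: 1 = case, 0 = non-case)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, labels):
    """Cutoff maximizing sensitivity + specificity (Youden's J)."""
    return max(sorted(set(scores)),
               key=lambda c: sum(sens_spec(scores, labels, c)))

# Hypothetical questionnaire scores and case status (1 = psychosocial problem).
scores = [30, 40, 45, 50, 52, 55, 60, 65, 70, 75]
labels = [0,  0,  0,  0,  1,  1,  0,  1,  1,  1]
cutoff = best_cutoff(scores, labels)
```
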
Simultaneous removal of SO2 and NO from simulated flue gas by cerium oxide supported on palm shell activated carbon (Ce/PSAC) was studied in a fixed-bed adsorber. In this study, the adsorption breakthrough of SO2 and NO on Ce/PSAC at different reaction temperatures was used to test the applicability of the model developed by Yoon and Nelson (1984) for breakthrough curves. Yoon and Nelson (1984) developed a relatively simple model addressing the adsorption and breakthrough of adsorbate vapour on activated charcoal. The model is based on the assumption that the rate of decrease in the probability of adsorption for each adsorbate molecule is proportional to the probability of adsorbate adsorption and the probability of adsorbate breakthrough on the adsorbent. A regression analysis (least squares method) was used to obtain the model parameters k and t1/2. The results showed satisfactory agreement between the model and the experimental results. It is concluded that the simple two-parameter Yoon-Nelson model can be applied to model the breakthrough curves of SO2 and NO adsorption over Ce/PSAC.
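The least-squares estimation of k and t1/2 mentioned above can be sketched via the standard linearization of the Yoon-Nelson equation, ln((C/C0)/(1 - C/C0)) = k·t - k·t1/2. The breakthrough data below are synthetic, generated from assumed parameter values purely to demonstrate the fit:

```python
import math

def fit_yoon_nelson(times, c_ratio):
    """Fit the Yoon-Nelson model C/C0 = 1 / (1 + exp(k*(t_half - t)))
    by ordinary least squares on the linearized form
    ln((C/C0) / (1 - C/C0)) = k*t - k*t_half."""
    ys = [math.log(r / (1.0 - r)) for r in c_ratio]
    n = len(times)
    mx, my = sum(times) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
         / sum((x - mx) ** 2 for x in times))
    t_half = mx - my / k
    return k, t_half

# Synthetic breakthrough data generated with k = 0.05 /min, t1/2 = 120 min.
k_true, t_true = 0.05, 120.0
times = [40, 80, 120, 160, 200]
ratio = [1 / (1 + math.exp(k_true * (t_true - t))) for t in times]
k_est, t_est = fit_yoon_nelson(times, ratio)
```
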
DNA fingerprinting, also known as DNA profiling, serves as a standard procedure in forensics to identify a person by the short tandem repeat (STR) loci in their DNA. By comparing the STR loci between DNA samples, practitioners can calculate a probability of match to identify the contributors of a DNA mixture. Most existing methods are based on the 13 core STR loci identified by the Federal Bureau of Investigation (FBI). Forensic analyses of DNA mixtures based on these loci are highly variable in procedure and suffer from subjectivity as well as bias in complex mixture interpretation. With the emergence of next-generation sequencing (NGS) technologies, the sequencing of billions of DNA molecules can be parallelized, greatly increasing throughput and reducing the associated costs. This allows the creation of new techniques that incorporate more loci to enable complex mixture interpretation. In this paper, we propose a likelihood ratio computation that uses NGS data for DNA testing on mixed samples. We applied the method to 4480 simulated DNA mixtures, consisting of various mixture proportions of 8 unrelated whole-genome sequencing datasets. The results confirm the feasibility of utilizing NGS data in DNA mixture interpretation. We observed an average likelihood ratio as high as 285,978 for two-person mixtures. Using our method, all 224 identity tests for two-person and three-person mixtures were correctly identified.
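The general shape of a likelihood-ratio calculation (not the paper's specific NGS method) can be sketched as a product of per-locus ratios under the assumption of independent loci; the per-locus probabilities below are hypothetical toy values:

```python
from functools import reduce

def likelihood_ratio(locus_probs):
    """Combined LR across independent loci: the product of per-locus ratios
    P(evidence | prosecution hypothesis) / P(evidence | defence hypothesis).
    locus_probs is a list of (p_numerator, p_denominator) pairs."""
    return reduce(lambda acc, pq: acc * (pq[0] / pq[1]), locus_probs, 1.0)

# Hypothetical per-locus probabilities for a two-person mixture: the
# evidence is certain under H1 but has low random-match probability under H2.
loci = [(1.0, 0.10), (1.0, 0.25), (1.0, 0.05)]
lr = likelihood_ratio(loci)  # lr >> 1 supports the prosecution hypothesis
```
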
The probability of a construction accident occurring is high due to the nature of construction work, which involves complex activities, methods, machinery, materials and hazards. Occupational safety and health (OSH) laws and regulations are mandatory for every construction project to uphold. Responsibility for ensuring safety and health at the workplace lies with those who create the risk and with those who work with the risk. The owner or client of a construction project has the upper hand in determining the standard of OSH implementation in their project through the contract documents. If the contract documents comprehensively spell out OSH requirements and cover all OSH costs, then the issue of contractors not implementing OSH measures could be minimized. The objective of this study is to identify occupational safety and health (OSH) requirements in the contract documents of selected construction projects. To achieve this objective, a total of seven contract documents were collected from several construction companies. A qualitative analysis was performed to identify the extent to which OSH requirements and costs are mentioned in the contract documents. The findings show that most of the contract documents place very little emphasis on OSH requirements and budgeting. Only one contract contains an appendix that spells out safe work practices for construction works. The visible allocated budget for OSH requirements in all seven contracts is very small, ranging from 0.21% to 1.99% of the contract value. To ensure that occupational safety and health is properly implemented, safety needs must be included in the budget, because implementation is not free; this can be achieved by making it a permanent feature in all bills of quantity of the project.
In recent years, vegetable oils such as Palm Oil (PO) have been identified as potential alternative dielectric insulating fluids for transformers. PO is biodegradable, non-toxic and has high flash and fire points. In this paper, a study of the positive lightning impulse breakdown voltages of PO under a non-uniform field is carried out. The testing was carried out using a needle-plane electrode configuration at gap distances of 25 mm and 50 mm. Rising-voltage, 1-shot-per-step and 3-shots-per-step testing methods were used, and 3 types of Refined, Bleached and Deodorized Palm Oil (RBDPO) and a Mineral Oil (MO) were examined. It was found that the testing method had no significant effect on the breakdown voltages of the samples. The breakdown voltages of all RBDPO samples at 50% probability are comparable with MO. At 1% probability and a gap distance of 50 mm, the breakdown voltages of all RBDPO samples are lower than those of MO.
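Breakdown voltages at a given cumulative probability, such as the 50% and 1% levels quoted above, are commonly read off a fitted two-parameter Weibull distribution. A minimal sketch, assuming hypothetical shape and scale values rather than the paper's fitted parameters:

```python
import math

def weibull_quantile(p, beta, eta):
    """Breakdown voltage with cumulative failure probability p under a
    two-parameter Weibull distribution: V(p) = eta * (-ln(1-p))**(1/beta)."""
    return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

# Hypothetical Weibull shape/scale for one oil sample at a 50 mm gap;
# in practice beta and eta are fitted to the measured breakdown data.
beta, eta = 8.0, 210.0  # eta in kV
v50 = weibull_quantile(0.50, beta, eta)  # median (50% probability) breakdown
v01 = weibull_quantile(0.01, beta, eta)  # low-probability (1%) breakdown
```

The 1% quantile is the quantity of practical interest for insulation design, since it sits far below the median and differs most between oils.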
The grounding system of a lightning protection scheme is designed basically to avoid arcing and dangerous step potentials. The grounding impedance of the system varies depending on soil structure and frequency. This paper describes the effect of harmonic impedance (also called the frequency dependence of soil) on the potential distribution under a lightning strike to a metal tower with a single grounding path, for different soil types. The results show that the peak values of the ground potential rise (GPR) and step voltage (SP) may reach extremely hazardous levels even at distances on the order of 90 m from the tower footing, especially when soil resistivity is high. Hence, we emphasise that, in contrast to power grounding, when designing grounding systems meant to handle transient or high-frequency currents, the frequency-dependent soil parameters should be considered to avoid hazardous situations, especially at locations with a high probability of lightning strikes, such as metal towers.
Substitutional clusters of multiple light element dopants are a promising route to the elusive shallow donor in diamond. To understand the behaviour of co-dopants, this report presents an extensive first-principles study of possible clusters of boron and nitrogen. We use periodic hybrid density functional calculations to predict the geometry, stability and electronic excitation energies of a range of clusters containing up to five N and/or B atoms. Excitation energies from hybrid calculations are compared to those from the empirical marker method, and are in good agreement.
When a boron-rich or nitrogen-rich cluster consists of 3-5 atoms, the minority dopant element (a nitrogen or boron atom, respectively) can be in either a central or peripheral position. We find that B-rich clusters are most stable when N sits centrally, whereas N-rich clusters are most stable with B in a peripheral position. In the former case, excitation energies mimic those of the single boron acceptor, while the latter produce deep levels in the band gap. Implications for the clusters likely to arise in high-pressure high-temperature (HPHT) co-doped diamond, and their properties, are discussed.
The quantification of microplastics in environmental samples often requires an observer to determine whether a particle is plastic or non-plastic prior to further verification procedures. This implies that inconspicuous microplastics with low natural detectability may be underestimated. The present study aimed to assess this underestimation, looking at how colour (white, green and blue), size (large, ~1000 μm, and small, <400 μm) and grain size fraction may affect detection. Sediment treatments varying in grain size were inoculated with known quantities of low-density polyethylene microbeads extracted from commercially bought facial scrubs. These microbeads varied in colour and size. Once extracted using a density separation method, the microbeads were counted. The overall underestimation of 78.59% may be a result of observer error and/or technical error. More specifically, the results suggested that microbeads varying in colour and size have different detection probabilities, and that these microbead features matter more to underestimation than grain size.