889 results for SOCIETY CLASSIFICATION CRITERIA


Relevance:

20.00%

Publisher:

Abstract:

Genomics and genetic findings have been hailed with promises of unlocked codes and new frontiers of personalized medicine. Despite cautions about gene hype, the strong cultural pull of genes and genomics has allowed consideration of genomic personhood. Populated by the complicated records of the mass spectrometer, proteomics, which studies the human proteome, has not achieved either the funding or the popular cultural appeal that proteomics scientists had hoped it would. While proteomics, being focused on the proteins that actually indicate and create disease states, has more direct potential for clinical applications than genomic risk predictions, it has not, culturally, provided the material for identity creation. In our ethnographic research, we explore how proteomic scientists attempt to shape an appeal to personhood through which legitimacy may be defined.

Relevance:

20.00%

Publisher:

Abstract:

Next Generation Sequencing (NGS) has revolutionised molecular biology, allowing routine clinical sequencing. NGS data consists of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. The common bacterium Staphylococcus aureus may cause severe and life-threatening infections in humans, with some strains exhibiting antibiotic resistance. Here we apply an SVM classifier to the important problem of distinguishing S. aureus sequencing projects from other pathogens, including closely related Staphylococci. Using a sequence k-mer representation, we achieve precision and recall above 95%, implicating features with important functional associations.
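
A minimal sketch of this kind of pipeline, assuming a simple k-mer count representation and a linear SVM in scikit-learn; the reads, labels, and k value below are placeholders, not the study's data or settings.

```python
# Sketch only: k-mer counting plus a linear SVM (scikit-learn).
# 'reads', 'labels', and K are placeholders, not the study's data.
from itertools import product
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

K = 4  # hypothetical k-mer length
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

def kmer_vector(read: str) -> np.ndarray:
    """Count k-mer occurrences in one sequence read."""
    v = np.zeros(len(KMERS))
    for i in range(len(read) - K + 1):
        kmer = read[i:i + K]
        if kmer in INDEX:          # skips ambiguous bases such as 'N'
            v[INDEX[kmer]] += 1
    return v

rng = np.random.default_rng(0)
def random_read(length=150):
    return "".join(rng.choice(list("ACGT"), size=length))

# Placeholder reads: label 1 for S. aureus, 0 for other species.
reads = [random_read() for _ in range(40)]
labels = np.array([1] * 20 + [0] * 20)

X = np.array([kmer_vector(r) for r in reads])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25,
                                           stratify=labels, random_state=0)
clf = LinearSVC().fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(precision_score(y_te, pred, zero_division=0),
      recall_score(y_te, pred, zero_division=0))
```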

Relevance:

20.00%

Publisher:

Abstract:

In this study, a tandem LC-MS (Waters Xevo TQ) MRM-based MS method was developed for rapid, broad profiling of hydrophilic metabolites from biological samples, in either positive or negative ion mode, without the need for an ion-pairing reagent, using a reversed-phase pentafluorophenylpropyl (PFPP) column. The developed method was successfully applied to analyze various biological samples from C57BL/6 mice, including urine, duodenum, liver, plasma, kidney, heart, and skeletal muscle. As a result, a total of 112 hydrophilic metabolites were detected within an 8 min run time to obtain a metabolite profile of the biological samples. The analysis of this number of hydrophilic metabolites is significantly faster than in previous studies. Classification and separation of metabolites from different tissues were analyzed globally by the PCA, PLS-DA and HCA biostatistical methods. Overall, most of the hydrophilic metabolites were found to have a "fingerprint" characteristic of tissue dependency. In general, higher levels of most metabolites were found in urine, duodenum, and kidney. Altogether, these results suggest that this method has potential application for targeted metabolomic analyses of hydrophilic metabolites in a wide range of biological samples.
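
As an illustration of the classification-separation step, a minimal PCA sketch on a placeholder metabolite intensity table (scikit-learn); the tissue names and values are invented, and the PLS-DA and HCA analyses are not reproduced.

```python
# Sketch only: PCA on a placeholder metabolite intensity table to look at
# tissue separation; 112 matches the metabolite count reported above.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
tissues = ["urine", "duodenum", "liver", "kidney"]
n_metabolites = 112
# Five replicate samples per tissue, each a vector of metabolite intensities.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(5, n_metabolites))
               for i, _ in enumerate(tissues)])
labels = np.repeat(tissues, 5)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for tissue in tissues:
    centroid = scores[labels == tissue].mean(axis=0)
    print(f"{tissue:>9s} PC1/PC2 centroid: {centroid.round(2)}")
```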

Relevance:

20.00%

Publisher:

Abstract:

In this paper, problems are described which are related to the ergonomic assessment of vehicle package design in vehicle systems engineering. The traditional approach, using questionnaire techniques for a subjective assessment of comfort related to package design, is compared to a biomechanical approach. An example is given for ingress design. The biomechanical approach is based upon objective postural data. The experimental setup for the study is described, and the methods used for the biomechanical analysis are explained. Because the biomechanical assessment requires not only a complex experimental setup but also time-consuming data processing, a systematic reduction and preparation of the biomechanical data for classification with an Artificial Neural Network significantly improves the economy of the biomechanical method.
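
A minimal sketch of the classification step, assuming reduced postural feature vectors and hypothetical comfort classes; a small scikit-learn feed-forward network stands in for the Artificial Neural Network described above.

```python
# Sketch only: a small feed-forward network classifying reduced postural
# feature vectors into hypothetical comfort classes (all data synthetic).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 8))       # e.g. summarised joint angles per ingress trial
y = rng.integers(0, 3, size=120)    # three hypothetical comfort ratings

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```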

Relevance:

20.00%

Publisher:

Abstract:

Cardiomyopathies represent a group of diseases of the myocardium of the heart and include both diseases primarily of the cardiac muscle and systemic diseases leading to adverse effects on heart muscle size, shape, and function. Traditionally, cardiomyopathies were defined according to phenotypical appearance. Now, as our understanding of the pathophysiology of the different entities classified under each of the different phenotypes improves and our knowledge of the molecular and genetic basis for these entities progresses, the traditional classifications seem overly simplistic and do not reflect current understanding of this myriad of diseases and disease processes. Although our knowledge of the exact basis of many of the disease processes of cardiomyopathies is still in its infancy, it is important to have a classification system that can incorporate the coming tide of molecular and genetic information. This paper discusses how the traditional classification of cardiomyopathies based on morphology has evolved due to rapid advances in our understanding of the genetic and molecular basis for many of these clinical entities.

Relevance:

20.00%

Publisher:

Abstract:

Migraine is a common neurological disorder with a strong genetic basis. However, the complex nature of the disorder has meant that few genes or susceptibility loci have been identified and replicated consistently to confirm their involvement in migraine. Approaches to genetic studies of the disorder have included analysis of the rare migraine subtype, familial hemiplegic migraine, with several causal genes identified for this severe subtype. However, the exact genetic contributors to the more common migraine subtypes are still to be deciphered. Genome-wide studies such as genome-wide association studies and linkage analysis, as well as candidate gene studies, have been employed to investigate genes involved in common migraine. Neurological, hormonal and vascular genes are all considered key factors in the pathophysiology of migraine and are a focus of many of these studies. It is clear that the influence of individual genes on the expression of this disorder will vary. Furthermore, the disorder may be dependent on gene–gene and gene–environment interactions that have not yet been considered. In addition, identifying susceptibility genes may require phenotyping methods outside of the International Classification of Headache Disorders II criteria, such as trait component analysis and latent class analysis, to better define the ambit of migraine expression.

Relevance:

20.00%

Publisher:

Abstract:

The overall aim of our research was to characterize airborne particles from selected nanotechnology processes and to utilize the data to develop and test quantitative particle concentration-based criteria that can be used to trigger an assessment of particle emission controls. We investigated particle number concentration (PNC), particle mass (PM) concentration, count median diameter (CMD), alveolar deposited surface area, elemental composition, and morphology from sampling of aerosols arising from six nanotechnology processes. These included fibrous and non-fibrous particles, including carbon nanotubes (CNTs). We adopted standard occupational hygiene principles in relation to controlling peak emissions and exposures, as outlined by both Safe Work Australia (1) and the American Conference of Governmental Industrial Hygienists (ACGIH®) (2). The results from the study were used to analyse peak (highest value recorded) and 30-minute averaged particle number and mass concentration values measured during the operation of the nanotechnology processes. This analysis revealed that peak PNC20–1000 nm emitted from the nanotechnology processes was up to three orders of magnitude greater than the local background particle concentration (LBPC), peak PNC300–3000 nm was up to an order of magnitude greater, and PM2.5 concentrations were up to four orders of magnitude greater. For three of these nanotechnology processes, the 30-minute average particle number and mass concentrations were also significantly different from the LBPC (p-value < 0.001). We propose that emission or exposure controls may need to be implemented or modified, or further assessment of the controls undertaken, if concentrations exceed three times the LBPC, which is also used as the local particle reference value, for more than a total of 30 minutes during a workday, and/or if a single short-term measurement exceeds five times the local particle reference value. The use of these quantitative criteria, which we term the universal excursion guidance criteria, will account for the typical variation in LBPC and the inaccuracy of instruments, while remaining precautionary enough to highlight peaks in particle concentration likely to be associated with particle emission from the nanotechnology process. Recommendations on when to utilize local excursion guidance criteria are also provided.
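
A minimal sketch of the proposed universal excursion guidance criteria as stated above (more than a cumulative 30 minutes above three times the local particle reference value, or any single short-term measurement above five times it); the readings and reference value are placeholders.

```python
# Sketch only: the proposed excursion check. 'readings' are
# (minutes_represented, concentration) pairs; 'reference' is the local
# particle reference value (the LBPC).
def exceeds_excursion_criteria(readings, reference,
                               sustained_factor=3.0, peak_factor=5.0,
                               sustained_minutes=30.0):
    minutes_above = sum(m for m, c in readings if c > sustained_factor * reference)
    peak_exceeded = any(c > peak_factor * reference for _, c in readings)
    return minutes_above > sustained_minutes or peak_exceeded

# Placeholder one-minute PNC readings (particles/cm^3) against a reference of 5,000.
readings = [(1, 4_000)] * 400 + [(1, 16_000)] * 35
print(exceeds_excursion_criteria(readings, reference=5_000))  # True: >30 min above 3x
```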

Relevance:

20.00%

Publisher:

Abstract:

Highly sensitive infrared cameras can produce high-resolution diagnostic images of the temperature and vascular changes of breasts. Wavelet transform based features are suitable for extracting the texture difference information of these images due to their scale-space decomposition. The objective of this study is to investigate the potential of the extracted features in differentiating between breast lesions by comparing the two corresponding pectoral regions of two breast thermograms. The pectoral regions of breasts are important because nearly 50% of all breast cancers are located in this region. In this study, the pectoral region of the left breast is selected, and the corresponding pectoral region of the right breast is then identified. Texture features based on first- and second-order statistics are extracted from wavelet-decomposed images of the pectoral regions of the two breast thermograms. Principal component analysis is used to reduce dimensionality and an AdaBoost classifier to evaluate classification performance. A number of different wavelet features are compared, and it is shown that complex non-separable 2D discrete wavelet transform features perform better than their real separable counterparts.
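
A minimal sketch of the feature extraction and classification chain, using a real separable 2D DWT (PyWavelets) rather than the complex non-separable transform reported above to perform best; images and labels are placeholders.

```python
# Sketch only: real separable DWT texture features -> PCA -> AdaBoost on
# placeholder thermogram patches (the complex non-separable transform from
# the study is not reproduced here).
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def texture_features(image: np.ndarray) -> np.ndarray:
    """Simple statistics of each level-1 DWT sub-band."""
    cA, (cH, cV, cD) = pywt.dwt2(image, "db4")
    feats = []
    for band in (cA, cH, cV, cD):
        feats += [band.mean(), band.std(), np.abs(band).mean(), (band ** 2).mean()]
    return np.array(feats)

rng = np.random.default_rng(3)
images = rng.normal(size=(60, 64, 64))   # synthetic 64x64 pectoral-region patches
y = rng.integers(0, 2, size=60)          # synthetic lesion / no-lesion labels

X = np.array([texture_features(img) for img in images])
model = make_pipeline(PCA(n_components=5), AdaBoostClassifier(random_state=0))
print("mean CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```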

Relevance:

20.00%

Publisher:

Abstract:

In most intent recognition studies, annotations of query intent are created post hoc by external assessors who are not the searchers themselves. It is important for the field to get a better understanding of the quality of this process as an approximation for determining the searcher's actual intent. Some studies have investigated the reliability of the query intent annotation process by measuring the interassessor agreement. However, these studies did not measure the validity of the judgments, that is, to what extent the annotations match the searcher's actual intent. In this study, we asked both the searchers themselves and external assessors to classify queries using the same intent classification scheme. We show that of the seven dimensions in our intent classification scheme, four can reliably be used for query annotation. Of these four, only the annotations on the topic and spatial sensitivity dimension are valid when compared with the searcher's annotations. The difference between the interassessor agreement and the assessor-searcher agreement was significant on all dimensions, showing that the agreement between external assessors is not a good estimator of the validity of the intent classifications. Therefore, we encourage the research community to consider using query intent classifications by the searchers themselves as test data.
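
As an illustration of comparing inter-assessor agreement with assessor-searcher agreement, a minimal sketch using Cohen's kappa; the paper's exact agreement statistic is not assumed here, and the labels are invented.

```python
# Sketch only: comparing assessor-assessor agreement with assessor-searcher
# agreement using Cohen's kappa on one invented intent dimension.
from sklearn.metrics import cohen_kappa_score

searcher   = ["local", "global", "local", "global", "local", "local"]
assessor_1 = ["local", "global", "global", "global", "local", "local"]
assessor_2 = ["local", "global", "global", "global", "global", "local"]

print("assessor vs assessor:", cohen_kappa_score(assessor_1, assessor_2))
print("assessor vs searcher:", cohen_kappa_score(assessor_1, searcher))
```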

Relevance:

20.00%

Publisher:

Abstract:

Increasing global competition, rapid technological changes, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring improvements in performance for dynamic lean supply chain situations. Therefore, the need for appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains, consisting of both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics mentioned in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean as well as quantitative and qualitative metrics are incorporated. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method is applied to measure the performance improvements in supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative to maximise similarity with the positive ideal solution and to minimise similarity with the negative ideal solution, the performance of lean and non-lean supply chain situations for three different apparel products has been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
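
A minimal sketch of a crisp TOPSIS ranking of alternative supply chain configurations; the thesis uses a fuzzy TOPSIS with triangular fuzzy numbers, which this simplified version replaces with crisp scores, and the weights and criteria values are placeholders.

```python
# Sketch only: crisp TOPSIS ranking of alternatives; the thesis's fuzzy
# variant (triangular fuzzy numbers) is simplified to crisp scores here.
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rows are alternatives, columns are criteria; 'benefit' marks criteria
    where larger values are better."""
    M = np.asarray(decision_matrix, dtype=float)
    V = (M / np.linalg.norm(M, axis=0)) * weights   # normalise, then weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                  # closeness to the positive ideal

# Placeholder: three supply chain configurations scored on time, quality, flexibility.
closeness = topsis([[12, 0.90, 3], [9, 0.85, 4], [10, 0.95, 5]],
                   weights=np.array([0.4, 0.3, 0.3]),
                   benefit=np.array([False, True, True]))  # shorter lead time is better
print(closeness)  # larger value = preferred alternative
```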

Relevance:

20.00%

Publisher:

Abstract:

Despite the significant contributions of urban road transport to the global economy and society, it is one of the largest sources of local and global emissions. In order to address the environmental concerns of urban road transport, it is imperative to achieve a holistic understanding of the contributory factors causing emissions, which requires examining its whole life cycle. Previous studies were mainly based on segmental views, mostly studying the environmental impacts of individual transport modes, and very few considered impacts beyond the operational phase. This study develops an integrated life cycle inventory model for urban road transport emissions from a holistic modal perspective. The case of Singapore was used to demonstrate the model. Results show that the total life cycle greenhouse gas emissions from Singapore's road transport sector are 7.8 million tons per year. The total amounts of criteria air pollutants are also estimated in this study.
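
A minimal sketch of the accounting idea behind a modal life cycle inventory (activity multiplied by a life cycle emission factor, summed across modes); the modes, activity levels, and factors below are invented and are not the study's inventory.

```python
# Sketch only: the accounting idea of a modal life cycle inventory, with
# invented activity data and life cycle emission factors.
annual_vkm = {"car": 2.0e10, "bus": 1.5e9, "motorcycle": 3.0e9}           # vehicle-km/year
lc_factor_g_per_vkm = {"car": 250.0, "bus": 1100.0, "motorcycle": 120.0}  # g CO2e per vkm

total_tonnes = sum(annual_vkm[m] * lc_factor_g_per_vkm[m] for m in annual_vkm) / 1e6
print(f"Total life cycle GHG: {total_tonnes:,.0f} t CO2e per year")
```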

Relevance:

20.00%

Publisher:

Abstract:

Internationally, transit oriented development (TOD) is characterised by moderate to high density development with diverse land use patterns and well connected street networks centred around high frequency transit stops (bus and rail). Although different TOD typologies have been developed in different contexts, they are based on subjective evaluation criteria derived from the context in which they are built and typically lack a validation measure. Arguably there exist sets of TOD characteristics that perform better in certain contexts, and being able to optimise TOD effectiveness would facilitate planning and supporting policy development. This research utilises data from census collection districts (CCDs) in Brisbane with different sets of TOD attributes measured across six objectively quantified built environment indicators: net employment density, net residential density, land use diversity, intersection density, cul-de-sac density, and public transport accessibility. Using these measures, a Two Step Cluster Analysis was conducted to identify natural groupings of the CCDs with similar profiles, resulting in four unique TOD clusters: (a) residential TODs, (b) activity centre TODs, (c) potential TODs, and (d) TOD non-suitability. The typologies are validated by estimating a multinomial logistic regression model in order to understand the mode choice behaviour of 10,013 individuals living in these areas. Results indicate that, in comparison to people living in areas classified as residential TODs, people who reside in non-TOD clusters were significantly less likely to use public transport (PT) (1.4 times) and active transport (4 times) compared to the car. People living in areas classified as potential TODs were 1.3 times less likely to use PT and 2.5 times less likely to use active transport compared to using the car. Little difference in mode choice behaviour was evident between people living in areas classified as residential TODs and activity centre TODs. The results suggest that: (a) two types of TODs may be suitable for classification and affect mode choice in Brisbane; (b) TOD typologies should be developed based on their TOD profiles and performance metrics; and (c) both bus stop and train station based TODs are suitable for development in Brisbane.
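
A minimal sketch of the two-stage analysis, with KMeans standing in for the Two Step Cluster Analysis and a multinomial logistic regression fitted on cluster membership; all indicator values, cluster assignments, and mode choices are synthetic.

```python
# Sketch only: KMeans stands in for the Two Step Cluster Analysis; all
# indicators, cluster assignments, and mode choices are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# Six indicators per CCD: employment density, residential density, land use
# diversity, intersection density, cul-de-sac density, PT accessibility.
ccd_indicators = rng.normal(size=(200, 6))
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(ccd_indicators))

# Synthetic individuals: home-CCD cluster -> chosen mode (0=car, 1=PT, 2=active).
person_cluster = rng.integers(0, 4, size=1000)
mode_choice = rng.integers(0, 3, size=1000)

X = np.eye(4)[person_cluster]                    # one-hot cluster membership
model = LogisticRegression(max_iter=1000).fit(X, mode_choice)
print(model.coef_.shape)                         # one coefficient row per mode
```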

Relevance:

20.00%

Publisher:

Abstract:

Antioestrogens are among the most widely used agents in the treatment of breast cancer. There has been a recent surge of interest in these compounds because of their potential breast cancer chemopreventive properties. The newer generation of antioestrogens, with increased selectivity and better toxicity profiles, have the potential to increase the effectiveness of hormonal treatment of breast cancer. The selective oestrogen receptor modulators (SERMs) hold the promise of revolutionising the care of healthy postmenopausal women with their beneficial effects on bone and lipids in addition to the chemoprevention of breast cancer.

Relevance:

20.00%

Publisher:

Abstract:

Background: Post-heart transplant psychological distress may both directly hinder physiological health and indirectly impact clinical outcomes by increasing unhealthy behaviours, such as immunosuppression non-adherence. Reducing psychological distress for heart transplant recipients is therefore vitally important, in order to improve patients' overall health and well-being as well as clinical outcomes such as morbidity and mortality. Evidence from other populations suggests that non-pharmacological interventions may be an effective strategy.

Aim: To appraise the efficacy of non-pharmacological interventions on psychological outcomes after heart transplant.

Method: A systematic review was conducted using the Joanna Briggs Institute methodology. Experimental and quasi-experimental studies that involved any non-pharmacological intervention for heart transplant recipients were included, provided that data on psychological outcomes were reported. Multiple electronic databases were searched for published and unpublished studies, and the reference lists of retrieved studies were scrutinized for further primary research. Data were extracted using a standardised data extraction tool. Included studies were assessed by two independent reviewers using standardised critical appraisal instruments.

Results: Three studies fulfilled the inclusion and exclusion criteria, involving only 125 heart transplant recipients. Two studies reported on exercise programs. One study reported a web-based psychosocial intervention. While psychological outcomes significantly improved from baseline to follow-up for the recipients who received the interventions, between-group comparisons were not reported. The methodological quality of the studies was judged to be poor.

Conclusions: Further research is required, as we found there is insufficient evidence available to draw conclusions for or against the use of non-pharmacological interventions after heart transplant.

Relevance:

20.00%

Publisher:

Abstract:

Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
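
A minimal sketch of the pass/fail criteria quoted above, applied to replicate measurements of a single SSP analyte; the numeric values are placeholders, and RT drift is approximated here as the max minus min retention time range.

```python
# Sketch only: the pass/fail criteria quoted above, applied to replicate
# injections of one SSP analyte (all numbers are placeholders).
import numpy as np

def system_suitability(peak_areas, peak_widths, retention_times):
    cv = lambda x: np.std(x) / np.mean(x)
    checks = {
        "peak area CV < 0.15":   cv(peak_areas) < 0.15,
        "peak width CV < 0.15":  cv(peak_widths) < 0.15,
        "RT std dev < 0.15 min": np.std(retention_times) < 0.15,
        "RT drift < 0.5 min":    max(retention_times) - min(retention_times) < 0.5,
    }
    return checks, all(checks.values())

areas  = [1.02e6, 0.98e6, 1.05e6, 0.99e6]   # arbitrary units
widths = [0.21, 0.22, 0.20, 0.21]           # minutes
rts    = [24.10, 24.15, 24.08, 24.12]       # minutes
checks, passed = system_suitability(areas, widths, rts)
print(checks, "PASS" if passed else "FAIL")
```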