949 results for Quantitative methodology


Relevance: 20.00%

Publisher:

Abstract:

Body size and development time are important life history traits because they are often highly correlated with fitness. Although the developmental mechanisms that control growth have been well studied, the mechanisms that control how a species-characteristic body size is achieved remain poorly understood. In insects, adult body size is determined by the number of larval molts, the size increment at each molt, and the mechanism that determines during which instar larval growth will stop. Adult insects do not grow, so the size at which a larva stops growing determines adult body size. Here we develop a quantitative understanding of the kinetics of growth throughout the larval life of Manduca sexta, under different conditions of nutrition and temperature, and for genetic strains with different adult body sizes. We show that the generally accepted view that the size increment at each molt is constant (Dyar's Rule) is systematically violated: there is actually a progressive increase in the size increment from instar to instar that is independent of temperature. In addition, the mass-specific growth rate declines throughout the growth phase in a temperature-dependent manner. We show that growth within an instar follows a truncated Gompertz trajectory. The critical weight, which determines when in an instar a molt will occur, and the threshold size, which determines which instar is the last, differ between genetic strains with different adult body sizes. Under nutrient and temperature stress Manduca has a variable number of larval instars, and we show that this is because more molts, at smaller increments, are taken before the threshold size is reached. Finally, we test whether these new insights into the kinetics of growth and size determination are sufficient to explain body size and development time, using a mathematical model that incorporates our quantitative findings.
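
As a rough illustration of the truncated Gompertz form described in the abstract, the sketch below assumes the mass-specific growth rate decays exponentially within an instar and that growth is simply cut off at the molt; the parameter names and values are placeholders, not the authors' fitted quantities.

```python
# Minimal sketch of a truncated Gompertz growth trajectory within a larval instar.
# Parameters (a, k, t_stop) are illustrative assumptions, not fitted values.
import numpy as np

def gompertz_mass(t, m0, a, k):
    """Mass at time t when the mass-specific growth rate decays as a*exp(-k*t)."""
    return m0 * np.exp((a / k) * (1.0 - np.exp(-k * t)))

def truncated_trajectory(m0, a, k, t_stop, n=100):
    """Growth curve truncated at t_stop, e.g. when the molt terminates growth."""
    t = np.linspace(0.0, t_stop, n)
    return t, gompertz_mass(t, m0, a, k)

t, m = truncated_trajectory(m0=1.0, a=0.8, k=0.3, t_stop=4.0)
specific_rate = 0.8 * np.exp(-0.3 * t)   # declines steadily through the instar
print(f"final mass: {m[-1]:.2f}, size increment: {m[-1] / m[0]:.2f}x")
```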

Relevance: 20.00%

Publisher:

Abstract:

Intraoperative assessment of surgical margins is critical to ensuring that no residual tumor remains in a patient. Previously, we developed a fluorescence structured illumination microscope (SIM) system with a single-shot field of view (FOV) of 2.1 × 1.6 mm (3.4 mm²) and sub-cellular resolution (4.4 μm). The goal of this study was to test the utility of this technology for the detection of residual disease in a genetically engineered mouse model of sarcoma. Primary soft tissue sarcomas were generated in the hindlimb, and after the tumor was surgically removed the relevant margin was stained with acridine orange (AO), a vital stain that brightly stains cell nuclei and fibrous tissues. The tissues were imaged with the SIM system with the primary goal of visualizing fluorescent features from tumor nuclei. Given the heterogeneity of the background tissue (presence of adipose tissue and muscle), an algorithm known as maximally stable extremal regions (MSER) was optimized and applied to the images to specifically segment nuclear features. A logistic regression model was used to classify a tissue site as positive or negative from the area fraction and shape of the segmented features, and the resulting receiver operating characteristic (ROC) curve was generated by varying the probability threshold. Based on the ROC curves, the model was able to classify tumor and normal tissue with 77% sensitivity and 81% specificity at the operating point given by Youden's index. For an unbiased measure of the model performance, it was applied to a separate validation dataset, which yielded 73% sensitivity and 80% specificity. When this approach was applied to representative whole margins with a tumor probability threshold of 50%, only 1.2% of all regions from the negative margin exceeded this threshold, while over 14.8% of all regions from the positive margin exceeded it.
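
A minimal sketch of the segmentation-and-classification pipeline the abstract describes, using OpenCV's MSER detector and scikit-learn's logistic regression. The MSER size bounds and the two summary features (area fraction plus a crude bounding-box aspect ratio) are assumptions for illustration, not the study's settings.

```python
# Illustrative MSER segmentation + logistic regression classifier for tissue sites.
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def nuclear_features(gray_image):
    """Segment nucleus-like blobs with MSER and summarise them as two features."""
    mser = cv2.MSER_create()
    mser.setMinArea(20)       # assumed lower/upper bounds on nucleus size, in pixels
    mser.setMaxArea(500)
    regions, bboxes = mser.detectRegions(gray_image)
    if len(regions) == 0:
        return np.array([0.0, 0.0])
    area_fraction = sum(len(r) for r in regions) / float(gray_image.size)
    # crude shape score: mean aspect ratio of each region's bounding box
    aspect = np.mean([min(w, h) / float(max(w, h)) for (_, _, w, h) in bboxes])
    return np.array([area_fraction, aspect])

# X_train: one feature vector per imaged tissue site, y_train: 1 = tumor, 0 = normal,
# taken from annotated training images.
def tumor_probability(X_train, y_train, gray_image):
    model = LogisticRegression().fit(X_train, y_train)
    return model.predict_proba(nuclear_features(gray_image)[None, :])[0, 1]
```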

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. STUDY DESIGN: A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. RESULTS: Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with "massive" PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a "stat" complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. CONCLUSION: The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist.

Relevance: 20.00%

Publisher:

Abstract:

This research validates a computerized dietary selection task (Food-Linked Virtual Response, or FLVR) for use in studies of food consumption. In two studies, FLVR task responses were compared with measures of health consciousness, mood, body mass index, personality, cognitive restraint toward food, and actual food selections from a buffet table. The FLVR task was associated with variables that typically predict healthy decision-making and was unrelated to mood or body mass index. Furthermore, the FLVR task predicted participants' unhealthy selections from the buffet, but not the overall amount of food selected. The FLVR task is an inexpensive, valid, and easily administered option for assessing momentary dietary decisions.

Relevance: 20.00%

Publisher:

Abstract:

Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments and used US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees but, more importantly, served as a definitive demonstration that digitally enhanced quantitative forms of analysis can be applied to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
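
The abstract does not give the feature set or corpus details, so the sketch below only illustrates the general pattern it relies on: derive quantitative features from text (here a simple bag-of-words matrix, standing in for Weka/MALLET output) and fit a binary logistic regression. The toy articles and labels are invented for the example.

```python
# Generic illustration: text -> quantitative features -> binary logistic regression.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

articles = [
    "the novel reveals the political turmoil of the revolution",
    "a lyrical study of landscape and memory",
    "the critic attacks the regime through the author's exile",
    "an intimate portrait of family life in the provinces",
]
politically_framed = [1, 0, 1, 0]   # hypothetical labels for the toy corpus

X = CountVectorizer(stop_words="english").fit_transform(articles)
model = LogisticRegression().fit(X, politically_framed)
print(model.predict_proba(X)[:, 1])  # per-article probability of political framing
```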

Relevance: 20.00%

Publisher:

Abstract:

In this paper a methodology for the application of computer simulation to the evacuation certification of aircraft is suggested. This involves the use of computer simulation, historic certification data, component testing, and full-scale certification trials. The methodology sets out a framework for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. In addition, a phased introduction of computer models to certification is suggested. The first step involves the use of computer simulation in conjunction with full-scale testing. The combination of a full-scale trial, computer simulation and, if necessary, component testing provides better insight into aircraft evacuation performance capabilities by generating a performance probability distribution rather than a single datum. Once further confidence in the technique is established, the requirement for the full-scale demonstration could be dropped. The second step in the adoption of computer simulation for certification involves the introduction of several scenarios, based for example on exit availability, informed by accident analysis. The final step would be the introduction of more realistic accident scenarios. This would require the continued development of aircraft evacuation modeling technology to include additional behavioral features common in real accident scenarios.
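
A minimal sketch of the paper's central point that repeated simulation yields a performance probability distribution rather than a single datum. The lognormal model of total evacuation time is a placeholder for a real evacuation simulator, and the 90-second limit is used purely as an illustrative pass/fail criterion.

```python
# Monte Carlo view of evacuation performance: many simulated runs give a distribution.
import random
import statistics

def simulated_evacuation_time():
    # stand-in for one run of an evacuation model with stochastic passenger behaviour
    return random.lognormvariate(mu=4.3, sigma=0.1)   # seconds

runs = [simulated_evacuation_time() for _ in range(1000)]
limit = 90.0                                           # illustrative certification limit (s)
p_pass = sum(t <= limit for t in runs) / len(runs)
print(f"mean {statistics.mean(runs):.1f}s, 95th pct {sorted(runs)[949]:.1f}s, "
      f"P(evacuation within {limit:.0f}s) = {p_pass:.2f}")
```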

Relevance: 20.00%

Publisher:

Abstract:

The Sahara desert is a significant source of particulate pollution not only to the Mediterranean region, but also to the Atlantic and beyond. In this paper, PM10 exceedances recorded in the UK and on the island of Crete are studied and their source investigated using Lagrangian Particle Dispersion (LPD) methods. Forward and inverse simulations identify Saharan dust storms as the primary source of these episodes. The methodology used allows comparison between this primary source and other possible candidates, for example large forest fires or volcanic eruptions. Two LPD models are used in the simulations, namely the open source code FLEXPART and the proprietary code HYSPLIT. Driven by the same meteorological fields (the ECMWF MARS archive and the PSU/NCAR Mesoscale model, known as MM5), the codes produce similar, but not identical, predictions. This inter-model comparison enables a critical assessment of the physical modelling assumptions employed in each code, as well as of the influence of boundary conditions and solution grid density. The outputs, in the form of particle concentrations evolving in time, are compared against satellite images and receptor data from multiple ground-based sites. Quantitative comparisons are good, especially in predicting the time of arrival of the dust plume at a particular location.
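
The sketch below shows the core idea behind Lagrangian particle dispersion codes such as FLEXPART and HYSPLIT: advect many particles with the resolved wind and add a stochastic term for unresolved turbulence. The uniform wind field and constant diffusivity here are invented placeholders, not the ECMWF/MM5 fields used in the paper.

```python
# Toy Lagrangian particle dispersion step: deterministic advection plus random walk.
import numpy as np

rng = np.random.default_rng(0)

def wind(positions, t):
    # placeholder for interpolated meteorological winds, here a uniform 10 m/s flow
    return np.tile(np.array([10.0, 2.0]), (len(positions), 1))

def step(positions, t, dt, diffusivity=50.0):
    """Advance particle positions (m) by one time step dt (s)."""
    drift = wind(positions, t) * dt
    turbulence = rng.normal(scale=np.sqrt(2.0 * diffusivity * dt), size=positions.shape)
    return positions + drift + turbulence

particles = np.zeros((10_000, 2))            # release 10,000 particles at the source
for k in range(240):                          # 240 steps of 60 s = 4 hours
    particles = step(particles, t=k * 60.0, dt=60.0)
print("plume centroid (km):", particles.mean(axis=0) / 1000.0)
```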

Relevance: 20.00%

Publisher:

Abstract:

This paper details a computational methodology for analysis of the structural behaviour of historic composite structures. The modelling approach is based on finite element analysis and has been developed to enable efficient and inexpensive computational mechanics analysis of complex composite structures. The discussion is primarily focussed on the modelling methodology and the analysis of structural designs that comprise beam components acting as stiffeners to a wider shell part of the structure. A computational strategy for the analysis of such composite structures, which exploits their representation through smeared shell models, is detailed in the paper.
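
A rough illustration of the "smeared" idealisation referred to above: the axial stiffness of discrete stiffeners is spread over the shell as an equivalent membrane stiffness per unit width, so the stiffened panel can be represented by shell elements alone. The material values and dimensions are illustrative, not taken from the paper.

```python
# Equivalent (smeared) extensional stiffness of a stiffened shell strip.
def smeared_membrane_stiffness(E_shell, t_shell, E_stiffener, area_stiffener, spacing):
    """Equivalent extensional stiffness (N per metre of width) in the stiffener direction."""
    return E_shell * t_shell + E_stiffener * area_stiffener / spacing

# e.g. a 12 mm shell stiffened by beams of 3000 mm^2 cross-section at 500 mm pitch
A = smeared_membrane_stiffness(E_shell=10e9, t_shell=0.012,
                               E_stiffener=11e9, area_stiffener=3.0e-3, spacing=0.5)
print(f"smeared extensional stiffness: {A/1e6:.1f} MN per metre of width")
```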

Relevance: 20.00%

Publisher:

Abstract:

Evaluating ship layout for human factors (HF) issues using simulation software such as maritimeEXODUS can be a long and complex process. The analysis requires the identification of relevant evaluation scenarios, encompassing evacuation and normal operations; the development of appropriate measures with which to gauge the performance of crew and vessel; and finally, the interpretation of considerable volumes of simulation data. Currently, the only agreed guidelines for evaluating the HF performance of ship designs relate to evacuation, so the conclusions drawn concerning the overall suitability of a ship design by one naval architect can be quite different from those of another. The complexity of the task grows as the size and complexity of the vessel increase and as the number and type of evaluation scenarios considered increases. Equally, it can be extremely difficult for fleet operators to set HF design objectives for new vessel concepts. The challenge for naval architects is to develop a procedure that allows both accurate and rapid assessment of the HF issues associated with vessel layout and crew operating procedures. In this paper we present a systematic and transparent methodology for assessing the HF performance of ship designs that is both discriminating and diagnostic. The methodology is demonstrated using two variants of a hypothetical naval ship.
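
As a generic sketch of the kind of aggregation such a methodology requires (not the paper's actual metric), the example below normalises each scenario-level performance measure and combines them with weights so two layout variants can be compared on one scale; all measures, weights and numbers are hypothetical.

```python
# Hypothetical weighted scoring of two design variants across HF scenarios.
measures = {   # scenario -> {measure: (variant_A, variant_B)}, lower is better
    "evacuation": {"time_to_muster_s": (420.0, 465.0), "congestion_time_s": (95.0, 60.0)},
    "normal_ops": {"watch_change_s": (300.0, 270.0), "distance_walked_m": (180.0, 150.0)},
}
weights = {"evacuation": 0.6, "normal_ops": 0.4}

def variant_scores(measures, weights):
    scores = {"A": 0.0, "B": 0.0}
    for scenario, ms in measures.items():
        for a, b in ms.values():
            worst = max(a, b)                      # normalise against the worse variant
            scores["A"] += weights[scenario] * a / worst
            scores["B"] += weights[scenario] * b / worst
    return scores

print(variant_scores(measures, weights))   # lower total = better HF performance
```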

Relevance: 20.00%

Publisher:

Abstract:

The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating technology risks under limited knowledge is a key requirement for the successful development of new technologies. In order to address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK grand challenge project - 3D-Mintegration. The main focus is on identifying risks through the identification of key product characteristics using a product breakdown approach. The identified risks are assessed using quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control, based on process capability and six sigma concepts, is applied to measure the process capability in the light of the risks that have been identified. The paper also details a numerical approach that can be used to undertake risk analysis, based on a computational framework in which modelling and statistical techniques are integrated. An example of the modelling and simulation technique is also given for focused ion beam machining, one of the manufacturing processes investigated in the project.
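
A minimal sketch of the process-capability element of the methodology: estimate the standard Cp and Cpk indices for a key product characteristic from measured samples. The specification limits and measurement data are invented for illustration.

```python
# Process capability indices (Cp, Cpk) for one key product characteristic.
import statistics

def capability(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6.0 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)    # capability allowing for centring
    return cp, cpk

# e.g. a milled feature with a nominal depth of 10.0 um and +/-0.5 um tolerance
depths = [10.08, 9.95, 10.12, 10.02, 9.90, 10.05, 10.10, 9.98, 10.07, 10.00]
cp, cpk = capability(depths, lsl=9.5, usl=10.5)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # 1.33 is a common minimum capability target
```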

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the development of new advanced technologies in the area of micro and nano systems. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB) milling. This process has been modelled to provide knowledge of how a pre-defined geometry can be achieved through direct milling. The geometry characterisation is obtained using Reduced Order Models (ROMs), generated from the results of a mathematical model of the focused ion beam together with Design of Experiments (DoE) methods. In this work the focus is on the design flow methodology, which includes an approach for incorporating process parameter uncertainties into the process optimisation modelling framework. A discussion of the impact of the process parameters, and their variations, on the quality and performance of the fabricated structure is also presented. The design task is to identify the optimal process conditions, by altering the process parameters, so that a specified reliability and confidence level for the application is achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
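
A sketch of the ROM-plus-DoE idea described above: fit a quadratic response surface to a small set of DoE runs of the process model, then search it for process settings that hit a target milled depth. The process parameter names (beam current, dwell time) and all numbers are hypothetical, not the paper's data.

```python
# Quadratic reduced order model (ROM) fitted to DoE runs, then used for design search.
import numpy as np

# DoE runs: (beam_current_nA, dwell_time_ms) -> simulated milled depth (um), invented data
X = np.array([[0.5, 1.0], [0.5, 2.0], [1.0, 1.0], [1.0, 2.0],
              [0.75, 1.5], [0.6, 1.8], [0.9, 1.2]])
depth = np.array([0.8, 1.5, 1.6, 3.1, 1.7, 1.5, 1.6])

def design_matrix(X):
    c, t = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(c), c, t, c * t, c**2, t**2])

coeffs, *_ = np.linalg.lstsq(design_matrix(X), depth, rcond=None)   # the ROM

def rom(c, t):
    return design_matrix(np.array([[c, t]])) @ coeffs

# crude design search: grid point whose predicted depth is closest to the target
target = 2.0
grid = [(c, t) for c in np.linspace(0.5, 1.0, 26) for t in np.linspace(1.0, 2.0, 26)]
best = min(grid, key=lambda p: abs(rom(*p)[0] - target))
print(f"suggested settings: current {best[0]:.2f} nA, dwell {best[1]:.2f} ms")
```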

Relevance: 20.00%

Publisher:

Abstract:

The article consists of a PowerPoint presentation on an integrated reliability and prognostics prediction methodology for power electronic modules. The areas discussed include the power electronics flagship, design for reliability, IGBT modules, design for manufacture, power module components, reliability prediction techniques, and failure-based reliability, among others.

Relevance: 20.00%

Publisher:

Abstract:

Today, the key to commercial success in manufacturing is the timely development of new products that are not only functionally fit for purpose but also offer high performance and quality throughout their entire lifecycle. In principle, this demands the introduction of a fully developed and optimised product from the outset. To accomplish this, manufacturing companies must leverage the existing knowledge embedded in their current technical, manufacturing and service capabilities. This is especially true in the field of tolerance selection and application, the subject area of this research. Tolerance knowledge must be readily available and deployed as an integral part of the product development process. This paper describes a methodology and framework, currently under development at a UK manufacturer, to achieve this objective.
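
A small example of the kind of tolerance knowledge such a framework would capture and reuse: comparing a worst-case stack-up of a chain of component tolerances with the statistical (root-sum-square) estimate. The dimensions are invented for illustration.

```python
# Worst-case vs statistical (RSS) tolerance stack-up for a chain of four parts.
import math

tolerances = [0.05, 0.10, 0.02, 0.08]        # +/- tolerances (mm) in the stack

worst_case = sum(tolerances)
rss = math.sqrt(sum(t**2 for t in tolerances))
print(f"worst-case stack: +/-{worst_case:.2f} mm, statistical (RSS): +/-{rss:.2f} mm")
```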