77 results for multilevel statistical modeling
Abstract:
The diffusion model for percutaneous absorption is developed for the specific case of delivery to the skin being limited by the application of a finite amount of solute. Two cases are considered: in the first, there is an application of a finite donor (vehicle) volume, and in the second, there are solvent-deposited solids and a thin vehicle with a high partition coefficient. In both cases, the potential effect of an interfacial resistance at the stratum corneum surface is also considered. As in the previous paper, which was concerned with the application of a constant donor concentration, clearance limitations due to the viable epidermis, the in vitro sampling rate, or perfusion rate in vivo are included. Numerical inversion of the Laplace domain solutions was used for simulations of solute flux and cumulative amount absorbed and to model specific examples of percutaneous absorption of solvent-deposited solids. It was concluded that numerical inversion of the Laplace domain solutions for a diffusion model of percutaneous absorption, using standard scientific software (such as SCIENTIST, MicroMath Scientific Software) on modern personal computers, is a practical alternative to computation of infinite series solutions. Limits of the Laplace domain solutions were used to define the moments of the flux-time profiles for finite donor volumes and the slope of the terminal log flux-time profile. The mean transit time could be related to the diffusion time through stratum corneum, viable epidermal, and donor diffusion layer resistances and clearance from the receptor phase. Approximate expressions for the time to reach maximum flux (peak time) and maximum flux were also derived. The model was then validated using reported amount-time and flux-time profiles for finite doses applied to the skin.
It was concluded that for very small donor phase volumes or for very large stratum corneum-vehicle partition coefficients (e.g., for solvent-deposited solids), the flux and amount of solute absorbed are affected by receptor conditions to a lesser extent than is evident for constant donor concentrations. (C) 2001 Wiley-Liss, Inc. and the American Pharmaceutical Association. J Pharm Sci 90:504-520, 2001.
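The abstract inverts its Laplace-domain flux solutions numerically with commercial software (SCIENTIST) rather than summing infinite series. Purely as an illustration of the general technique, and not necessarily the routine the authors used, the classical Gaver-Stehfest algorithm can be sketched in a few lines and checked against a transform pair with a known inverse:

```python
import math

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace transform:
    f(t) ~ (ln2/t) * sum_{k=1..N} V_k * F(k*ln2/t), with N even."""
    n2 = N // 2
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        v = 0.0
        for j in range((k + 1) // 2, min(k, n2) + 1):
            v += (j ** n2 * math.factorial(2 * j)
                  / (math.factorial(n2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        total += (-1) ** (k + n2) * v * F(k * ln2 / t)
    return ln2 / t * total

# Sanity check on a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
```

The Stehfest weights grow rapidly with N, so in double precision N is usually kept at or below about 14; the method works well for the smooth, non-oscillatory flux-time profiles that arise in diffusion problems like this one.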
Abstract:
The goal of the current study was to identify discrete longitudinal patterns of change in adolescent smoking using latent growth mixture modeling. Five distinct longitudinal patterns were identified. A group of early rapid escalators was characterized by early escalation (at age 13) that rapidly increased to heavy smoking. A pattern characterized by occasional puffing up until age 15, at which time smoking escalated to moderate levels, was also identified (late moderate escalators). Another group included adolescents who, after age 15, began to escalate slowly in their smoking to light (0.5 cigarettes per month) levels (late slow escalators). Finally, a group of stable light smokers (those who smoked 1-2 cigarettes per month) and a group of stable puffers (those who smoked only a few puffs per month) were also identified. The stable puffer group was the largest group and represented 25% of smokers.
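Latent growth mixture models are normally fitted with specialized software and are not reducible to a few lines. As a rough stdlib stand-in for the underlying idea, and emphatically not the authors' method, grouping individual trajectories by shape can be sketched with plain k-means clustering of synthetic smoking trajectories (all names and values below are invented for illustration):

```python
def assign(trajectories, centers):
    """Assign each trajectory to its nearest center (squared Euclidean distance)."""
    groups = [[] for _ in centers]
    for tr in trajectories:
        d = [sum((a - b) ** 2 for a, b in zip(tr, c)) for c in centers]
        groups[d.index(min(d))].append(tr)
    return groups

def kmeans(trajectories, centers, iters=20):
    """Lloyd's algorithm: alternate nearest-center assignment and center updates."""
    for _ in range(iters):
        groups = assign(trajectories, centers)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, assign(trajectories, centers)

# Synthetic trajectories over 5 measurement waves:
# a flat "stable" shape vs. a rising "escalator" shape
stable = [[0.1 + 0.02 * i for _ in range(5)] for i in range(20)]
escalating = [[0.5 * j + 0.02 * i for j in range(5)] for i in range(20)]
data = stable + escalating
centers, groups = kmeans(data, [data[0], data[-1]])
```

Unlike a true mixture model, k-means gives hard assignments and no likelihood, so it cannot choose the number of classes the way growth mixture modeling does; it only illustrates the trajectory-grouping intuition.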
Abstract:
Land related information about the Earth's surface is commonly found in two forms: (1) map information and (2) satellite image data. Satellite imagery provides a good visual picture of what is on the ground, but complex image processing is required to interpret features in an image scene. Increasingly, methods are being sought to integrate the knowledge embodied in map information into the interpretation task, or, alternatively, to bypass interpretation and perform biophysical modeling directly on derived data sources. A cartographic modeling language, as a generic map analysis package, is suggested as a means to integrate geographical knowledge and imagery in a process-oriented view of the Earth. Specialized cartographic models may be developed by users, which incorporate mapping information in performing land classification. In addition, a cartographic modeling language may be enhanced with operators suited to processing remotely sensed imagery. We demonstrate the usefulness of a cartographic modeling language for pre-processing satellite imagery, and define two new cartographic operators that evaluate image neighborhoods as post-processing operations to interpret thematic map values. The language and operators are demonstrated with an example image classification task.
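The abstract does not define its two new neighborhood operators, so as a hedged example of the general kind of operator described, here is a standard cartographic majority (focal) filter: each cell of a thematic map is reassigned to the most common class in its square neighborhood, a common post-classification smoothing step:

```python
def majority_filter(grid, radius=1):
    """Reassign each cell of a thematic map to the modal class of its
    (2*radius+1)-square neighborhood, clipped at the map edges."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            counts = {}
            for dr in range(-radius, radius + 1):
                for dc in range(-radius, radius + 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        counts[grid[rr][cc]] = counts.get(grid[rr][cc], 0) + 1
            out[r][c] = max(counts, key=counts.get)
    return out

# A lone misclassified pixel (class 2) inside a uniform class-1 patch:
themap = [[1, 1, 1],
          [1, 2, 1],
          [1, 1, 1]]
smoothed = majority_filter(themap)
```

Operators like this fit naturally into a cartographic modeling language because they consume and produce whole map layers, matching its map-algebra style of composition.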
Abstract:
Now that some of the genes involved in asthma and allergy have been identified, interest is turning to how genetic predisposition interacts with exposure to environmental risk factors. These questions are best answered by studies in which both genotypes and other risk factors are measured, but even simpler studies, in which family history is used as a proxy for genotype, have made suggestive findings. For example, early breast feeding may increase the risk of allergic disease in genetically susceptible children, and decrease the risk of 'sporadic' allergy. This review also addresses the overall importance of genetic causes of allergic disease in the general population.
Abstract:
This article reports on the results of a study undertaken by the author together with her research assistant, Heather Green. The study collected and analysed data from all disciplinary tribunal decisions heard in Queensland since 1930 in an attempt to provide empirical information which has previously been lacking. This article will outline the main features of the disciplinary system in Queensland, describe the research methodology used in the present study and then report on some findings from the study. Reported findings include a profile of solicitors who have appeared before a disciplinary hearing, the types of matters which have attracted formal discipline and the types of orders made by the tribunal. Much of the data is then presented on a time scale so as to reveal any changes over time.
Abstract:
The monitoring of infection control indicators including hospital-acquired infections is an established part of quality maintenance programmes in many health-care facilities. However, surveillance data use can be frustrated by the infrequent nature of many infections. Traditional methods of analysis often provide delayed identification of increasing infection occurrence, placing patients at preventable risk. The application of Shewhart, Cumulative Sum (CUSUM) and Exponentially Weighted Moving Average (EWMA) statistical process control charts to the monitoring of indicator infections allows continuous real-time assessment. The Shewhart chart will detect large changes, while CUSUM and EWMA methods are more suited to recognition of small to moderate sustained change. When used together, Shewhart and EWMA methods are ideal for monitoring bacteraemia and multiresistant organism rates. Shewhart and CUSUM charts are suitable for surgical infection surveillance.
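The CUSUM and EWMA statistics named above are simple recursions. A minimal sketch with illustrative parameter values (the counts, target, and reference value below are invented, not taken from the article) applied to hypothetical monthly infection counts with a small sustained shift after month 6:

```python
def cusum(xs, target, k=0.5):
    """One-sided upper CUSUM: S_i = max(0, S_{i-1} + x_i - (target + k)).
    A signal is raised when S_i exceeds a chosen decision limit h."""
    s, path = 0.0, []
    for x in xs:
        s = max(0.0, s + x - (target + k))
        path.append(s)
    return path

def ewma(xs, lam=0.2, start=None):
    """Exponentially weighted moving average: z_i = lam*x_i + (1-lam)*z_{i-1}."""
    z = xs[0] if start is None else start
    path = []
    for x in xs:
        z = lam * x + (1 - lam) * z
        path.append(z)
    return path

# Hypothetical monthly counts: stable around 2-3, then shifted to 4-5
counts = [2, 3, 2, 2, 3, 2, 4, 5, 4, 5, 4, 5]
c = cusum(counts, target=2.5, k=0.5)
e = ewma(counts, lam=0.2, start=2.5)
```

Note how the CUSUM stays at zero through the stable months and then accumulates steadily, while the EWMA drifts upward more gradually; this is exactly the small-sustained-shift sensitivity the abstract attributes to both methods, in contrast to a Shewhart chart, which reacts only to single large excursions.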
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
In modeling expectation formation, economic agents are usually viewed as forming expectations adaptively or in accordance with some rationality postulate. We offer an alternative nonlinear model where agents exchange their opinions and information with each other. Such a model yields multiple equilibria, or attracting distributions, that are persistent but subject to sudden large jumps. Using German Federal Statistical Office economic indicators and German IFO Poll expectational data, we show that this kind of model performs well in simulation experiments. Focusing upon producers' expectations in the consumption goods sector, we also discover evidence that structural change in the interactive process occurred over the period of investigation (1970-1998). Specifically, interactions in expectation formation seem to have become less important over time.
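The abstract does not specify the interaction model, so the following is only a minimal mean-field caricature of opinion interaction, not the authors' specification: with average opinion x in [-1, 1] and interaction strength beta, the recursion x_{t+1} = tanh(beta * x_t) reproduces the qualitative claim that weak interaction has a single equilibrium while strong interaction produces multiple persistent attracting equilibria:

```python
import math

def opinion_index(beta, x0=0.01, steps=200):
    """Iterate the mean-field opinion dynamics x_{t+1} = tanh(beta * x_t).
    For beta <= 1 the only equilibrium is x = 0; for beta > 1 two nonzero
    attracting equilibria appear (persistent optimism or pessimism), and
    which one the system locks into depends on the initial tilt x0."""
    x = x0
    for _ in range(steps):
        x = math.tanh(beta * x)
    return x

weak = opinion_index(beta=0.8)    # weak interaction: opinion decays toward 0
strong = opinion_index(beta=2.0)  # strong interaction: locks into a nonzero equilibrium
```

The abstract's finding that interactions became less important over 1970-1998 corresponds, in this caricature, to beta drifting downward, which would move the system from the bistable regime toward the single-equilibrium one.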
Abstract:
This article proposes a more accurate approach to dopant extraction using combined inverse modeling and forward simulation of scanning capacitance microscopy (SCM) measurements on p-n junctions. The approach takes into account the essential physics of minority carrier response to the SCM probe tip in the presence of lateral electric fields due to a p-n junction. The effects of oxide fixed charge and interface state densities in the grown oxide layer on the p-n junction samples were considered in the proposed method. The extracted metallurgical and electrical junctions were compared to the apparent electrical junction obtained from SCM measurements. (C) 2002 American Institute of Physics.
Abstract:
We present a novel maximum-likelihood-based algorithm for estimating the distribution of alignment scores from the scores of unrelated sequences in a database search. Using a new method for measuring the accuracy of p-values, we show that our maximum-likelihood-based algorithm is more accurate than existing regression-based and lookup table methods. We explore a more sophisticated way of modeling and estimating the score distributions (using a two-component mixture model and expectation maximization), but conclude that this does not improve significantly over simply ignoring scores with small E-values during estimation. Finally, we measure the classification accuracy of p-values estimated in different ways and observe that inaccurate p-values can, somewhat paradoxically, lead to higher classification accuracy. We explain this paradox and argue that statistical accuracy, not classification accuracy, should be the primary criterion in comparisons of similarity search methods that return p-values that adjust for target sequence length.
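The abstract does not give the algorithm itself, so as a hedged sketch of the general approach rather than the authors' method: optimal local alignment scores of unrelated sequences are classically modeled by a Gumbel (extreme-value) distribution, and its maximum-likelihood equations can be solved by fixed-point iteration on the scale parameter:

```python
import math, random

def fit_gumbel(xs, iters=200):
    """ML fit of a Gumbel(mu, beta) distribution by fixed-point iteration:
        beta = mean(x) - sum(x*w)/sum(w),  with w = exp(-x/beta)
        mu   = -beta * log(mean(w))"""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    beta = math.sqrt(6.0 * var) / math.pi  # method-of-moments starting value
    for _ in range(iters):
        w = [math.exp(-x / beta) for x in xs]
        beta = mean - sum(x * wi for x, wi in zip(xs, w)) / sum(w)
    w = [math.exp(-x / beta) for x in xs]
    mu = -beta * math.log(sum(w) / len(xs))
    return mu, beta

def pvalue(score, mu, beta):
    """P(S >= score) under the fitted Gumbel null."""
    return 1.0 - math.exp(-math.exp(-(score - mu) / beta))

# Simulated 'unrelated sequence' scores drawn from a Gumbel with mu=10, beta=2
rng = random.Random(1)
scores = [10.0 - 2.0 * math.log(-math.log(rng.random())) for _ in range(5000)]
mu, beta = fit_gumbel(scores)
```

This single-component fit is the baseline the abstract's two-component mixture-plus-EM variant is compared against; per the abstract, the mixture did not significantly outperform simply excluding small-E-value scores during estimation.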