903 results for General theory of fields and particles


Relevance: 100.00%

Abstract:

A range of models describing metapopulations is surveyed and their implications for conservation biology are described. An overview of the use of both population genetic elements and demographic theory in metapopulation models is given. It would appear that most of the current models suffer from either the use of over-simplified demography or the avoidance of selectively important genetic factors. The scale for which predictions are made by the various models is often obscure. A conceptual framework for describing metapopulations by utilising the concept of fitness of local populations is provided and some examples are given. The expectation that any general theory, such as that of metapopulations, can make useful predictions for particular problems of conservation is examined and compared with the prevailing 'state of the art' recommendations.
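As a concrete illustration of the simplest demographic formulation in this literature, the classic Levins patch-occupancy model tracks only the fraction of habitat patches occupied. The sketch below is our own minimal example with purely illustrative parameter values, not a model taken from the survey.

```python
# Classic Levins patch-occupancy metapopulation model:
#   dP/dt = c * P * (1 - P) - e * P
# P is the fraction of occupied habitat patches, c the colonisation
# rate, and e the local-extinction rate.

def levins(p0, c, e, dt=0.01, steps=10_000):
    """Integrate the Levins model with forward Euler; returns final P."""
    p = p0
    for _ in range(steps):
        p += dt * (c * p * (1 - p) - e * p)
    return p

# When c > e, occupancy approaches the equilibrium P* = 1 - e/c;
# when c <= e, the metapopulation goes extinct.
print(round(levins(0.1, c=0.5, e=0.2), 3))  # 0.6 (= 1 - 0.2/0.5)
```

Even this caricature shows the kind of scale ambiguity the survey criticizes: P says nothing about within-patch population sizes or genetics.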

Relevance: 100.00%

Abstract:

Report on a review of selected general and application controls over the Iowa State University of Science and Technology Room and Board System for the period of April 9, 2012 through May 1, 2012

Relevance: 100.00%

Abstract:

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, assuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are distinct theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to treat the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have at least a minimal understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
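The two frameworks can be made concrete with a two-sided z-test; the observed statistic and the α level below are illustrative assumptions, and the code is a minimal sketch rather than any specific procedure from the paper.

```python
import math

def two_sided_p(z):
    """Two-sided p value for a standard-normal test statistic z:
    P(|Z| >= |z|) under the null hypothesis of no effect."""
    return math.erfc(abs(z) / math.sqrt(2))

z = 2.5                      # illustrative observed statistic

# Fisher: report the p value as a graded measure of evidence against H0.
p = two_sided_p(z)           # about 0.0124

# Neyman-Pearson: fix the Type I error level alpha in advance and
# reject H0 iff the statistic falls in the critical region.
alpha = 0.05
z_crit = 1.96                # two-sided critical value for alpha = 0.05
reject = abs(z) > z_crit

print(round(p, 4), reject)   # 0.0124 True
```

Note the two outputs answer different questions: the p value grades the evidence, while the reject/accept decision only reports which side of the pre-chosen critical region the statistic fell on.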

Relevance: 100.00%

Abstract:

Collectively, research aimed at understanding the regeneration of certain tissues has unveiled the existence of common key regulators. Knockout studies of the murine Nuclear Factor I-C (NFI-C) transcription factor revealed a misregulation of growth factor signaling, in particular that of transforming growth factor β1 (TGF-β1), which led to alterations of skin wound healing and the growth of its appendages, suggesting it may be a general regulator of regenerative processes. We sought to investigate this further by determining whether NFI-C played a role in liver regeneration. Liver regeneration following two-thirds removal of the liver by partial hepatectomy (PH) is a well-established regenerative model whereby changes elicited in hepatocytes following injury lead to a rapid, phased proliferation. However, mechanisms controlling the action of liver proliferative factors such as TGF-β1 and plasminogen activator inhibitor-1 (PAI-1) remain largely unknown. We show that the absence of NFI-C impaired hepatocyte proliferation due to an overexpression of PAI-1 and the subsequent suppression of urokinase plasminogen activator (uPA) activity and of signaling by hepatocyte growth factor (HGF), a potent hepatocyte mitogen. This indicated that NFI-C first acts to promote hepatocyte proliferation at the onset of liver regeneration in wild-type mice. The subsequent transient downregulation of NFI-C, as can be explained by a self-regulatory feedback loop with TGF-β1, may limit the number of hepatocytes entering the first wave of cell division and/or prevent late initiations of mitosis. Overall, we conclude that NFI-C acts as a regulator of the phased hepatocyte proliferation during liver regeneration. Taken together with NFI-C's actions in other in vivo models of (re)generation, it is plausible that NFI-C may be a general regulator of regenerative processes.
- Research aimed at understanding the regeneration of certain tissues has revealed the existence of common key regulators. Studies of mice lacking the gene encoding the transcription factor NFI-C (Nuclear Factor I-C) showed misregulation of the signaling of certain growth factors, in particular TGF-β1 (transforming growth factor β1), leading to alterations in skin wound healing and in the growth of hair and teeth in these mice, suggesting that NFI-C could be a general regulator of regenerative processes. We sought to investigate this further by determining whether NFI-C plays a role in liver regeneration. Liver regeneration induced by partial hepatectomy, the removal of two-thirds of the liver, is a well-established regeneration model in which the injury leads to rapid, synchronized proliferation of hepatocytes. However, the mechanisms controlling the action of liver proliferative factors, such as the growth factor TGF-β1 and plasminogen activator inhibitor-1 (PAI-1), remain largely unknown. We were able to show that the absence of NFI-C impairs hepatocyte proliferation, caused by the overexpression of PAI-1 and the subsequent suppression of urokinase plasminogen activator (uPA) activity and of signaling by hepatocyte growth factor (HGF), a potent hepatocyte mitogen. This indicates that NFI-C acts first to promote hepatocyte proliferation at the onset of liver regeneration in wild-type mice.
The subsequent transient decrease in NFI-C, which can be explained by a self-regulatory feedback loop with TGF-β1, may limit the number of hepatocytes entering the first wave of cell division and/or inhibit late initiation of mitosis. Taken together, these results allowed us to conclude that NFI-C acts as a regulator of synchronized hepatocyte proliferation during liver regeneration.

Relevance: 100.00%

Abstract:

A new model for decision making under risk that considers subjective and objective information in the same formulation is presented here, together with the uncertain probabilistic weighted average (UPWA). Its main advantage is that it unifies the probability and the weighted average in the same formulation while accounting for the degree of importance of each case in the analysis. Moreover, it can deal with uncertain environments represented in the form of interval numbers. We study some of its main properties and particular cases. The applicability of the UPWA is also studied and shown to be very broad, because all previous studies that use the probability or the weighted average can be revisited with this new approach. Focus is placed on a multi-person decision-making problem regarding the selection of strategies using the theory of expertons.
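A minimal sketch of the idea can be written by combining a probabilistic part and a weighted-average part through a unification coefficient, with interval arithmetic standing in for the uncertain arguments. The function names, the coefficient beta, and all numeric values below are our own illustrative assumptions, not the paper's exact formulation.

```python
def scale(k, iv):
    """Multiply the interval iv = (lo, hi) by a non-negative scalar k."""
    return (k * iv[0], k * iv[1])

def add(ivs):
    """Sum a list of intervals endpoint-wise."""
    return (sum(lo for lo, _ in ivs), sum(hi for _, hi in ivs))

def upwa(args, probs, weights, beta):
    """Uncertain probabilistic weighted average (sketch):
    beta * sum(p_i * a_i) + (1 - beta) * sum(w_i * a_i),
    where each argument a_i is an interval number (lo, hi)."""
    prob_part = add([scale(p, a) for p, a in zip(probs, args)])
    wa_part = add([scale(w, a) for w, a in zip(weights, args)])
    return add([scale(beta, prob_part), scale(1 - beta, wa_part)])

# Hypothetical strategy with three uncertain payoffs (interval numbers):
payoffs = [(10, 20), (30, 40), (50, 70)]
result = upwa(payoffs, probs=[0.3, 0.4, 0.3],
              weights=[0.2, 0.3, 0.5], beta=0.6)
print(result)  # an interval around (32.4, 46.2)
```

With beta = 1 the operator reduces to the probabilistic expectation, and with beta = 0 to the ordinary weighted average, which is the unification the abstract describes.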

Relevance: 100.00%

Abstract:

Report on a review of selected general and application controls over the Iowa State University of Science and Technology student financial aid system for the period of April 22, 2013 through May 17, 2013

Relevance: 100.00%

Abstract:

The formation of silicon particles in rf glow discharges has attracted attention due to their effect as a contaminant during film deposition or etching. However, silicon and silicon alloy powders produced by plasma-enhanced chemical vapor deposition (PECVD) are promising new materials for sintering ceramics, for making nanoscale filters, or for supporting catalytic surfaces. Common characteristics of these powders are their high purity and the easy control of their stoichiometry through the composition of the precursor gas mixture. Plasma parameters also influence their structure. Nanometric powders of silicon-carbon alloys exhibiting microstructural properties such as large hydrogen content and high surface/volume ratio have been produced in a PECVD reactor using mixtures of silane and methane at low pressure (~1 Torr) and low-frequency square-wave modulated rf power (13.56 MHz). The a-Si1-xCx:H powders were obtained from different precursor gas mixtures, from R=0.05 to R=9, where R=[SiH4]/([SiH4]+[CH4]). The structure of the a-Si1-xCx:H powder was analyzed by several techniques. The particles appeared agglomerated, with a wide size distribution between 5 and 100 nm. The silane/methane gas mixture determined the vibrational features of these powders in the infrared. Silicon-hydrogen groups were present for every gas composition, whereas carbon-hydrogen and silicon-carbon bonds appeared in methane-rich mixtures (R ≤ 0.6). The thermal desorption of hydrogen revealed two main evolutions at about 375 and 660 °C that were ascribed to hydrogen bonded to silicon and carbon, respectively. The estimated hydrogen atom concentration in the sample was about 50%.

Relevance: 100.00%

Abstract:

Monodispersed colloidal crystals based on silica sub-micrometric particles were synthesized using the Stöber-Fink-Bohn process. The control of nucleation and coalescence results in improved characteristics such as high sphericity and very low size dispersion. The resulting silica particles show characteristics suitable for self-assembling across large areas of closely-packed 2D crystal monolayers by an accurate Langmuir-Blodgett deposition process on glass, fused silica and silicon substrates. Due to their special optical properties, colloidal films have potential applications in fields including photonics, electronics, electro-optics, medicine (detectors and sensors), membrane filters and surface devices. The deposited monolayers of silica particles were characterized by means of FESEM, AFM and optical transmittance measurements in order to analyze their specific properties and characteristics. We propose a theoretical calculation for the photonic band gaps in 2D systems using an extrapolation of the photonic behavior of the crystal from 3D to 2D. In this work we show that the methodology used and the conditions in the self-assembly processes are decisive for producing high-quality two-dimensional colloidal crystals by the Langmuir-Blodgett technique.

Relevance: 100.00%

Abstract:

SUMMARY: Introduction The objective of this prospective cohort study was to estimate the effectiveness of a standardized care process for alcohol-dependent patients in the setting of a general-care university hospital. This care model comprised a multidisciplinary evaluation followed by individualized, specialized treatment proposals (a "therapeutic project"). Patients and methods 165 alcohol-dependent patients were recruited in various wards of the university hospital, including its medical outpatient clinic. They were first evaluated by a multidisciplinary team (internist, psychiatrist, social worker), and a specialized, individualized therapeutic project was then proposed to them at a meeting bringing together the patient and the team. All eligible patients who agreed to participate in the study (n=68) were interviewed by a psychologist at inclusion and again 2 and 6 months later. Standardized information was collected on patient characteristics, the care process, and outcomes at 6 months. The success criteria at 6 months were adherence to the proposed treatment and abstinence from alcohol. Results At the 6-month evaluation, 43% of patients were still in treatment and 28% were abstinent. Among patient characteristics, the predictors of success were age over 45, not living alone, being employed, and being motivated for treatment (RAATE-A < 18). Among process variables, complete withdrawal from alcohol at the time of the multidisciplinary meeting and the presence of all team members at that meeting were factors associated with success.
Conclusion The effectiveness of this intervention model for alcohol-dependent patients in a general-care hospital proved satisfactory, particularly for the treatment-adherence criterion. Variables associated with success or failure at 6 months could be identified, distinguishing patient populations with different courses. Care strategies taking these elements into account could therefore be developed, allowing better-tailored treatments and better retention of alcohol-dependent patients in therapeutic programs. ABSTRACT. To assess the effectiveness of a multidisciplinary evaluation and referral process in a prospective cohort of general hospital patients with alcohol dependence, alcohol-dependent patients were identified in the wards of the general hospital and its primary care center. They were evaluated and then referred to treatment by a multidisciplinary team; the patients who agreed to participate in this cohort study were consecutively included and followed for 6 months. Patients not included were lost to follow-up, whereas all included patients were assessed at inclusion and 2 and 6 months later by a research psychologist to collect standardized baseline patient characteristics, salient process features, and patient outcomes (defined as treatment adherence and abstinence). Multidisciplinary evaluation and therapeutic referral proved feasible and effective, with success rates of 43% for treatment adherence and 28% for abstinence at 6 months. Among patient characteristics, predictors of success were age over 45, not living alone, being employed, and being motivated for treatment (RAATE-A score < 18), whereas successful process characteristics included detoxification of the patient at the time of referral and a full multidisciplinary referral meeting.
This multidisciplinary model of evaluation and referral of alcohol-dependent patients in a general hospital had a satisfactory level of effectiveness. Predictors of success and failure allow the identification of subsets of patients for whom new strategies of motivation and treatment referral should be designed.

Relevance: 100.00%

Abstract:

In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they aren't oriented toward helping determine which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level and vice versa. In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence?
Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective? In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After a number of concepts were tried out, a promising approach emerged: view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) that roads are but one sub-system of a much larger 'Road Based Transportation System', 2) that the size and activity level of the overall system are determined by market forces, 3) that the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation, and 4) that the economic purpose of making road improvements is to minimize that total cost. To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at the full-system, partial-system, single-segment, project, and general design guide levels of analysis.
The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
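The decision rule at the heart of the method can be sketched in a few lines: score each alternative by its total system cost rather than by agency cost alone. The cost categories and figures below are hypothetical placeholders, not values from the Iowa model.

```python
# "Total Cost of Transportation" decision sketch: evaluate each road
# alternative by the annualized cost of the whole road-based system
# (agency spending plus vehicle operation, travel time, and crashes),
# not by agency cost alone.  All figures are hypothetical ($M/yr).

COST_ITEMS = ("road_agency", "vehicle_operation", "travel_time", "crashes")

def total_system_cost(alt):
    """Sum every system cost category the alternative implies."""
    return sum(alt[item] for item in COST_ITEMS)

alternatives = {
    "do_nothing":  {"road_agency": 0.1, "vehicle_operation": 5.0,
                    "travel_time": 4.0, "crashes": 1.2},
    "resurface":   {"road_agency": 0.6, "vehicle_operation": 4.5,
                    "travel_time": 3.8, "crashes": 1.1},
    "reconstruct": {"road_agency": 1.5, "vehicle_operation": 4.2,
                    "travel_time": 3.5, "crashes": 0.9},
}

# The economic purpose of an improvement is to minimize total cost, which
# can justify spending more agency money than a roads-only view would.
best = min(alternatives, key=lambda k: total_system_cost(alternatives[k]))
print(best)  # resurface (total 10.0 vs 10.3 and 10.1)
```

The same comparison works at any scale, from a single segment to the full network, which is the scalability the text argues existing tools lack.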

Relevance: 100.00%

Abstract:

Report on a review of selected general and application controls over the Iowa State University of Science and Technology Kuali Financial System for the period April 30, 2014 through May 28, 2014

Relevance: 100.00%

Abstract:

Back-focal-plane interferometry is used to measure displacements of optically trapped samples with very high spatial and temporal resolution. However, the technique is closely related to a method that measures the rate of change in light momentum. It has long been known that displacements of the interference pattern at the back focal plane may be used to track the optical force directly, provided that a considerable fraction of the light is effectively monitored. Nonetheless, the practical application of this idea has been limited to counter-propagating, low-aperture beams, where accurate momentum measurements are possible. Here, we experimentally show that the connection can be extended to single-beam optical traps. In particular, we show that, in a gradient trap, the calibration product κ·β (where κ is the trap stiffness and 1/β is the position sensitivity) corresponds to the factor that converts detector signals into momentum changes; this factor is uniquely determined by three construction features of the detection instrument and does not depend, therefore, on the specific conditions of the experiment. We then find that force measurements obtained from back-focal-plane displacements are in practice not restricted to a linear relationship with position, and hence can be extended outside the linear regime. Finally, and more importantly, we show that these properties remain recognizable even when the system is not fully optimized for light collection. These results should enable a more general use of back-focal-plane interferometry whenever the ultimate goal is the measurement of the forces exerted by an optical trap.
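The signal-to-force conversion described above can be sketched in a few lines; the values of κ and β below are assumed, typical orders of magnitude for illustration only, and the function name is our own.

```python
# In a gradient trap the detector signal S maps to bead position via
# x = beta * S (1/beta being the position sensitivity) and to force via
# F = kappa * x, so the product kappa * beta converts the detector
# signal directly into force (rate of light-momentum change).
# kappa and beta below are illustrative, assumed values.

kappa = 0.05e-3   # trap stiffness, N/m (0.05 pN/nm)
beta = 200e-9     # inverse position sensitivity, m/V (200 nm/V)

def force_from_signal(signal_volts):
    """Convert a back-focal-plane detector signal (V) into force (N)."""
    return kappa * beta * signal_volts

# A 0.5 V signal corresponds to x = 100 nm, so F = 0.05 pN/nm * 100 nm:
print(force_from_signal(0.5))  # ~5e-12 N (5 pN)
```

The point of the abstract is that the product κ·β is fixed by the detection instrument, so this one conversion factor can be reused across experiments instead of recalibrating κ and β separately each time.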