712 results for Expanded critical incident approach
Abstract:
Theoretical models are developed for the continuous-wave and pulsed laser incision and cutting of thin single- and multi-layer films. A one-dimensional steady-state model establishes the theoretical foundations of the problem by combining a power-balance integral with heat flow in the direction of laser motion. In this approach, classical modelling methods for laser processing are extended by introducing multi-layer optical absorption and thermal properties. The calculation domain is accordingly divided in correspondence with the progressive removal of individual layers. A second, time-domain numerical model for the short-pulse laser ablation of metals accounts for changes in optical and thermal properties during a single laser pulse. At sufficient fluence, the target surface is heated towards its critical temperature and homogeneous boiling or "phase explosion" takes place. Improvements over previous work are obtained through more accurate calculation of optical absorption and of shielding of the incident beam by the ablation products. A third, general time-domain numerical laser processing model combines ablation depth and energy absorption data from the short-pulse model with two-dimensional heat flow in an arbitrary multi-layer structure. Layer removal results both from progressive short-pulse ablation and from classical vaporisation due to long-term heating of the sample. At low velocity, pulsed laser exposure of multi-layer films comprising aluminium-plastic and aluminium-paper is found to be characterised by short-pulse ablation of the metallic layer and vaporisation or degradation of the others due to thermal conduction from the former. At high velocity, all layers of the two films are ultimately removed by vaporisation or degradation as the average beam power is increased to achieve a complete cut. The transition velocity between the two characteristic removal types is shown to be a function of the pulse repetition rate.
An experimental investigation validates the simulation results and provides new laser processing data for some typical packaging materials.
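The steady-state power-balance idea described above can be sketched in a few lines. The following is a minimal illustration only, not the thesis model: all symbol names, coefficient values, and the single-layer form are assumptions. It equates the absorbed beam power with the enthalpy flux needed to heat the material in the kerf to its vaporisation temperature and vaporise it.

```python
# Minimal single-layer power-balance sketch for laser cutting (illustrative
# assumption, not the thesis model). The balance used here is:
#   A * P = rho * v * w * d * (c * (Tv - T0) + Lv)
# i.e. absorbed power equals the mass removal rate times the enthalpy needed
# to heat and vaporise the material. Solving for the cut depth d:

def cut_depth(P, A, v, w, rho, c, Tv, T0, Lv):
    """Cut depth (m) from a simple steady-state power balance.

    P   incident beam power (W)
    A   absorptivity of the layer (-)
    v   cutting velocity (m/s)
    w   kerf width (m)
    rho density (kg/m^3)
    c   specific heat (J/kg/K)
    Tv  vaporisation temperature (K)
    T0  initial temperature (K)
    Lv  latent heat of vaporisation (J/kg)
    """
    return A * P / (rho * v * w * (c * (Tv - T0) + Lv))

# Example with rough aluminium-like property values (assumed):
d = cut_depth(P=200.0, A=0.1, v=0.5, w=50e-6, rho=2700.0,
              c=900.0, Tv=2743.0, T0=293.0, Lv=1.05e7)
```

The sketch reproduces the expected scaling of such a balance: depth grows linearly with absorbed power and falls inversely with cutting velocity.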
Abstract:
The prospect of continuously multiplying lifestyles, the obsolescence of traditional typological diagrams, and the usability of spaces on different territorial scales impose on contemporary architecture the search for new models of living. Limited densities in urban development have produced the erosion of territory and an increase in harmful emissions and energy consumption. High-density housing cannot ignore the social emergency of ensuring high-quality, low-cost dwellings for a new target population: students, temporary workers, key workers, foreigners, young couples without children, large families and, in general, people who carry out public services. Social housing strategies have become particularly relevant in regenerating high-density urban outskirts. The choice of this research topic derives from the desire to address the recent housing emergency from different perspectives, with a view to contributing to the current literature by proposing tools for the sound design of social housing that ensure good-quality, cost-effective and eco-sustainable solutions from the concept phase, through management and maintenance, to the end of the building life cycle. The purpose of the thesis is to define a framework of guidelines that serve as effective instruments for designing social housing. They should also integrate the existing regulations and are mainly intended for practitioners in this sector, while also supporting students who have to cope with this particular residential theme, and the users themselves. From the scientific evidence of both the recent specialized literature and the solutions adopted in case studies within the selected metropolitan areas of Milan, London and São Paulo, it is possible to identify the principles of this new design approach, in which the connection between typology, morphology and technology pursues the goal of a high living standard.
Abstract:
Heusler materials have so far been investigated mainly as bulk and thin-film samples owing to their technological importance. This thesis reports experimental investigations of the chemical synthesis, structure, and magnetic properties of ternary Heusler nanoparticles. The fundamental physics, chemistry, and materials science of Heusler nanoparticles were examined. In addition, a silica-supported preparation method was developed for carbon-coated, ternary intermetallic Co2FeGa nanoparticles. Formation of the L21 Co2FeGa phase was confirmed by X-ray diffraction (XRD), extended X-ray absorption fine structure (EXAFS) spectroscopy, and 57Fe Mössbauer spectroscopy. The dependence of the phase and size of the Co2FeGa nanoparticles on the composition of the precursor and the silica was investigated. By combining particle sizes obtained from transmission electron microscopy (TEM) with Mössbauer spectroscopy, the critical size for the transition from superparamagnetic to ferromagnetic behaviour of Co2FeGa nanoparticles was determined. The silica-supported chemical synthesis of Co2FeGa nanoparticles holds great promise as a general preparation route for Co-based Heusler nanoparticles. Furthermore, a chemical preparation method for metallic nanoparticles using synchrotron radiation was investigated; the nanoparticles obtained in this way are promising materials for nanobiotechnology and nanomedicine.
Abstract:
The aim of this study was to investigate the interconnection between the processes of proliferation, dedifferentiation, and intrinsic redifferentiation (chondrogenic) capacities of human articular chondrocyte (HAC), and to identify markers linking HAC dedifferentiation status with their chondrogenic potential. Cumulative population doublings (PD) of HAC expanded in monolayer culture were determined, and a threshold range of 3.57-4.19 PD was identified as indicative of HAC loss of intrinsic chondrogenic capacity in pellets incubated without added chondrogenic factors. While several specific gene and surface markers defined early HAC dedifferentiation process, no clear correlation with the loss of intrinsic chondrogenic potential could be established. CD90 expression during HAC monolayer culture revealed two subpopulations, with sorted CD90-negative cells showing lower proliferative capacity and higher chondrogenic potential compared to CD90-positive cells. Although these data further validated PD as critical for in vitro chondrogenesis, due to the early shift in expression, CD90 could not be considered for predicting chondrogenic potential of HAC expanded for several weeks. In contrast, an excellent mathematically modeled correlation was established between PD and the decline of HAC expressing the intracellular marker S100, providing a direct link between the number of cell divisions and dedifferentiation/loss of intrinsic chondrogenic capacity. Based on the dynamics of S100-positive HAC during expansion, we propose asymmetric cell division as a potential mechanism of HAC dedifferentiation, and S100 as a marker to assess chondrogenicity of HAC during expansion, of potential value for cell-based cartilage repair treatments.
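The cumulative population-doubling (PD) bookkeeping underlying the threshold above is simple to state. In the sketch below, only the PD definition and the 3.57-4.19 threshold window come from the abstract; the per-passage cell counts are invented for illustration.

```python
import math

def population_doublings(n_seeded, n_harvested):
    """Doublings in one passage: PD = log2(harvested / seeded)."""
    return math.log2(n_harvested / n_seeded)

# Hypothetical monolayer expansion over three passages
# (seeded, harvested) cell counts -- invented for illustration:
passages = [(1.0e5, 4.0e5), (2.0e5, 6.0e5), (2.0e5, 5.0e5)]
cumulative_pd = sum(population_doublings(s, h) for s, h in passages)

# Threshold window from the abstract: loss of intrinsic chondrogenic
# capacity is indicated around 3.57-4.19 cumulative PD.
lost_capacity = cumulative_pd > 4.19
```

With these invented counts the cumulative PD is about 4.9, past the upper end of the reported window, so the sketch would flag the cells as having lost intrinsic chondrogenic capacity.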
Abstract:
BACKGROUND: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment. Tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step. OBJECTIVES: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such "extended SSSI displays" from raw data. RESULTS: We present the Multi Wave Animator (MWA) framework, a set of open-source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (e.g., video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
Abstract:
Incorporation of enediynes into anticancer drugs remains an intriguing yet elusive strategy for the design of therapeutically active agents. Density functional theory was used to locate reactants, products, and transition states along the Bergman cyclization pathways connecting enediynes to reactive para-biradicals. A sum-method correction to low-level calculations, herein described as MI:Sum, confirmed B3LYP/6-31G(d,p) as the method of choice for investigating enediynes. Calculated reaction enthalpies differed from experiment by an average of 2.1 kcal·mol−1 (mean unsigned error). A combination of the strain energy released across the reaction coordinate and the critical intramolecular distance between the reacting diynes explains reactivity differences. Where experimental and calculated barrier heights are in disagreement, higher-level multireference treatment of the enediynes confirms the lower-level estimates. Previous work concerning MTC, the chemically reactive fragment of esperamicin, is expanded to our model system MTC2.
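The 2.1 kcal·mol−1 figure quoted above is a mean unsigned error over calculated-versus-experimental enthalpy pairs. As a quick illustration of the metric only (the enthalpy values below are invented, not from the study):

```python
def mean_unsigned_error(calculated, experimental):
    """Mean absolute deviation between paired values."""
    return sum(abs(c - e) for c, e in zip(calculated, experimental)) / len(calculated)

# Invented reaction enthalpies in kcal/mol, for illustration only:
calc = [10.2, 14.8, 9.1]
expt = [12.0, 13.5, 10.0]
mue = mean_unsigned_error(calc, expt)  # (1.8 + 1.3 + 0.9) / 3
```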
Abstract:
BACKGROUND: The interaction of sevoflurane and opioids can be described by response surface modeling using the hierarchical model. We expanded this for the combined administration of sevoflurane, opioids, and 66 vol.% nitrous oxide (N2O), using historical data on the motor and hemodynamic responsiveness to incision, i.e., the minimal alveolar concentration and the minimal alveolar concentration to block autonomic reflexes to nociceptive stimuli, respectively. METHODS: Four potential actions of 66 vol.% N2O were postulated: (1) N2O is equivalent to A ng/ml of fentanyl (additive); (2) N2O reduces the C50 of fentanyl by factor B; (3) N2O is equivalent to X vol.% of sevoflurane (additive); (4) N2O reduces the C50 of sevoflurane by factor Y. These four actions, and all combinations, were fitted to the data using NONMEM (version VI, Icon Development Solutions, Ellicott City, MD), assuming identical interaction parameters (A, B, X, Y) for movement and sympathetic responses. RESULTS: Sixty-six vol.% N2O evokes an additive effect corresponding to 0.27 ng/ml fentanyl (A) together with an additive effect corresponding to 0.54 vol.% sevoflurane (X). Parameters B and Y did not improve the fit. CONCLUSION: The effect of nitrous oxide can be incorporated into the hierarchical interaction model with a simple extension. The model can be used to predict the probability of movement and sympathetic responses during sevoflurane anesthesia, taking into account interactions with opioids and 66 vol.% N2O.
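The additive result above (66 vol.% N2O equivalent to A = 0.27 ng/ml fentanyl plus X = 0.54 vol.% sevoflurane) amounts to a pre-processing step applied to the measured concentrations before the hierarchical response surface is evaluated. The sketch below shows only that bookkeeping, not the full hierarchical model; the function name and example concentrations are assumptions.

```python
# Additive N2O equivalents reported in the abstract:
A_FENTANYL_EQ = 0.27  # ng/ml fentanyl equivalent of 66 vol.% N2O
X_SEVO_EQ = 0.54      # vol.% sevoflurane equivalent of 66 vol.% N2O

def effective_concentrations(sevoflurane, fentanyl, n2o_66pct):
    """Return (sevoflurane vol.%, fentanyl ng/ml) adjusted for 66 vol.% N2O.

    With N2O present, each drug's concentration is simply shifted by its
    additive N2O equivalent before the interaction model is applied.
    """
    if n2o_66pct:
        return sevoflurane + X_SEVO_EQ, fentanyl + A_FENTANYL_EQ
    return sevoflurane, fentanyl

# Example (concentrations invented for illustration):
sevo_eff, fent_eff = effective_concentrations(1.8, 1.0, n2o_66pct=True)
```

Parameters B and Y (the C50-reduction factors) did not improve the fit, which is why no multiplicative scaling appears in this sketch.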
Abstract:
Higher education has a responsibility to educate a democratic citizenry, and recent research indicates civic engagement is on the decline in the United States. Through a mixed-methods approach, I demonstrate that well-structured short-term international service-learning programming has the potential to develop college students’ civic identities. Quantitative analysis of questionnaire data, collected from American college students immediately prior to their participation in a short-term service-learning experience in Northern Ireland and again upon their return to the United States, revealed increases in civic accountability, political efficacy, justice-oriented citizenship, and service-learning. Subsequent qualitative analysis of interview transcripts, student journals, and field notes suggested that facilitated critical reflection before, during, and after the experience promoted transformational learning. Emergent themes included: (a) responsibilities to others, (b) the value of international service-learning, (c) cross-pollination of ideas, (d) stepping outside the daily routine to facilitate divergent thinking, and (e) the necessity of precursory thinking for sustaining transformations in thinking. The first theme, responsibilities to others, was further divided into subthemes of thinking beyond oneself, raising awareness of responsibility to others, and voting responsibly.
Abstract:
Post-Soviet countries are in the process of transformation from a totalitarian order to a democratic one, a transformation which is impossible without a profound shift in people's way of thinking. The group set themselves the task of determining the essence of this shift. Using a multidisciplinary approach, they looked at concrete ways of overcoming the totalitarian mentality and forming the mentality necessary for an open democratic society. They studied contemporary conceptions of tolerance and critical thinking and looked for new foundations of criticism, especially in hermeneutics. They then sought to substantiate the complementary relation between tolerance and criticism in the democratic way of thinking and to prepare a syllabus for teaching on the subject in Ukrainian higher education. In a philosophical exploration of tolerance they began with religious tolerance as its first and most important form. Political and social interests often lay at the foundations of religious intolerance, and this implicitly comprised the transition to religious tolerance when conditions changed. Early polytheism was more or less indifferent to dogmatic deviations, but monotheism is intolerant of heresies. The damage wrought by the religious wars of the Reformation transformed tolerance into a value. The wars did not create religious tolerance but forced its recognition as a positive phenomenon. With the weakening of religious institutions in the modern era, the purely political nature of many conflicts became evident, and this stimulated the extrapolation of tolerance into secular life. Each historical era has certain acts and operations which may be interpreted as tolerant, and these can be classified according to whether or not they are based on the conscious following of the principle of tolerance. This criterion requires the separation of the phenomenon of tolerance from its concept and from tolerance as a value.
Only the conjunction of a concept of tolerance with a recognition of its value can transform it into a principle dictating a norm of conscious behaviour. The analysis of the contemporary conception of tolerance focused on the diversity of the concept and concluded that the notions used cannot be combined in the framework of a single more or less simple classification, as the distinctions between them are stimulated by the complexity of the reality considered and the variety of its manifestations. Notions considered in relation to tolerance included pluralism, respect and the particular-universal. The rationale of tolerance was also investigated, and the group felt that any substantiation of the principle of tolerance must take into account human beings' desire for knowledge. Before respecting or being tolerant of another person different from myself, I should first know where the difference lies, so knowledge is a necessary condition of tolerance. The traditional division of truth into scientific (objective and unique) and religious, moral and political (subjective and so multiple) intensifies the problem of the relationship between truth and tolerance. Science was long seen as a field of "natural" intolerance, whereas the validity of tolerance was accepted in other intellectual fields. As tolerance emerges when there is difference and opposition, it is essentially linked with rivalry, and there is a growing recognition today that unlimited rivalry is able neither to direct the process of development nor to act as a creative force. Social and economic reality has led to rivalry being regulated by the state, and a natural requirement of this is to associate tolerance with a special "purified" form of rivalry, an acceptance of the activity of different subjects and a specification of the norms of their competition.
Tolerance and rivalry should therefore be subordinate to a degree of discipline, and the group point out that discipline, including self-discipline, is a regulator of the balance between them. Two problematic aspects of tolerance were identified: why something traditionally supposed to have no positive content has become a human activity today, and whether tolerance has full-scale cultural significance. The resolution of these questions requires a revision of the phenomenon and conception of tolerance to clarify its immanent positive content. This involved an investigation of the contemporary concept of tolerance and of the epistemological foundations of a negative view of tolerance in Greek thought. An original solution to the problem of the extrapolation of tolerance to scientific knowledge was proposed, based on the Duhem-Quine theses and the conception of background knowledge. In this way tolerance as a principle of mutual relations between different scientific positions gains an essential epistemological rationale, and so an important argument for its own universal status. The group then went on to consider the ontological foundations for a positive solution of this problem, beginning with the work of Poincaré and Reichenbach. The next aspect considered was the conceptual foundations of critical thinking, looking at the ideas of Karl Popper and St. Augustine and at the problem of the demarcation line between reasonable criticism and apologetic reasoning. Dogmatic and critical thinking in a political context were also considered, before an investigation of the foundations of critical thinking. As logic is essential to critical thinking, the state of this discipline in Ukrainian and Russian higher education was assessed, together with the limits of formal-logical grounds for criticism, the role of informal logic as a basis for critical thinking today, dialectical logic as a foundation for critical thinking, and the universality of the contemporary demand for criticism.
The search for new foundations of critical thinking covered deconstructivism and critical hermeneutics, including the problem of the author. The relationship between tolerance and criticism was traced from the ancient world, both Eastern and Greek, through the transitional community of the Renaissance to the industrial community (Locke and Mill) and the evolution of this relationship today, when these are viewed not as moral virtues but as ordinary norms. Tolerance and criticism were discussed as complementary manifestations of human freedom. If the completeness of freedom were accepted, it would be impossible to avoid recognition of the natural and legal nature of these manifestations, and the group argue that critical tolerance is able to avoid dismissing such negative phenomena as the degradation of taste and manner, pornography, etc. On the basis of their work, the group drew up the syllabus of a course in "Logic with Elements of Critical Thinking" and of a special course on the "Problem of Tolerance".
Abstract:
Lateral segregation of cholesterol- and sphingomyelin-rich rafts and glycerophospholipid-containing non-raft microdomains has been proposed to play a role in a variety of biological processes. The most compelling evidence for membrane segregation is based on the observation that extraction with non-ionic detergents leads to solubilization of a subset of membrane components only. However, one decade later, a large body of inconsistent detergent-extraction data is threatening the very concept of membrane segregation. We have assessed the validity of the existing paradigms and we show the following. (i) The localization of a membrane component within a particular fraction of a sucrose gradient cannot be taken as a yardstick for its solubility: a variable localization of the DRMs (detergent-resistant membranes) in sucrose gradients is the result of complex associations between the membrane skeleton and the lipid bilayer. (ii) DRMs of variable composition can be generated by using a single detergent, the increasing concentration of which gradually extracts one protein/lipid after another. Therefore any extraction pattern obtained by a single concentration experiment is bound to be 'investigator-specific'. It follows that comparison of DRMs obtained by different detergents in a single concentration experiment is prone to misinterpretations. (iii) Depletion of cholesterol has a graded effect on membrane solubility. (iv) Differences in detergent solubility of the members of the annexin protein family arise from their association with chemically different membrane compartments; however, these cannot be attributed to the 'brick-like' raft-building blocks of fixed size and chemical composition. Our findings demonstrate a need for critical re-evaluation of the accumulated detergent-extraction data.
Abstract:
After 75 years of invasive and over 50 years of interventional cardiology, cardiac catheter-based procedures have become the most frequently used interventions of modern medicine. Patients undergoing a percutaneous coronary intervention (PCI) outnumber those with coronary artery bypass surgery by a factor of 2 to 4. The default approach to PCI is the implantation of a (drug-eluting) stent, in spite of the fact that it improves the results of balloon angioplasty in only about 25% of cases. The dominance of stenting over conservative therapy or balloon angioplasty on the one hand and bypass surgery on the other is a flagrant example of how medical research is digested and applied in real life. Apart from electrophysiological interventions, closure of the patent foramen ovale and percutaneous replacement of the aortic valve in the elderly have the potential of becoming daily routine procedures in catheterization laboratories around the world. Stem cell regeneration of vessels or heart muscle, on the other hand, may remain a dream never to come true.
Abstract:
The prevention of ischaemia and the adequate restitution of blood flow to ischaemic tissue are pivotal to halt the progression of cellular injury associated with decreased oxygen and nutrient supply. Accordingly, the search for novel strategies which aim at preventing ischaemia-reperfusion-induced tissue damage is still of major interest in flap surgery. Preconditioning represents an elegant approach to render the tissue more resistant against deleterious ischaemic insults. For many decades, 'surgical delay' has been the standard method of tissue preconditioning. During the last 10 years, ischaemic preconditioning was added to the repertoire of plastic surgeons to protect flaps from ischaemic necrosis. The invasiveness and expenditure of time of these procedures, however, have always been major drawbacks, hindering a wide distribution in clinical practice. Consequently, the motivation has all along been to further refine and simplify protective strategies. Recent experimental studies have now shown that efficient protection from ischaemic necrosis can also be achieved by remote preconditioning or pretreatment with chemical agents and growth factors, which mimic the action of surgical delay and ischaemic preconditioning. In addition, the local application of unspecific stressors, including both heating and cooling, has been shown to effectively improve flap microcirculation and, thus, tissue survival. In view of successful translational research, it is now time that the efficacy of these novel preconditioning procedures is proven in prospective randomised clinical trials.
Abstract:
This project proposes a module for teaching visual composition within the context of a written composition course. Drawing from process writing theory, critical pedagogy, and photo-elicitation, “Composing In Words And Images” gives composition teachers a module and direct instruction for the incorporation of critical visual composition studies in their writing classes.
Abstract:
Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Developing an all-encompassing definition for riparian ecotones is challenging because of their high variability. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have utilized fixed-width buffers, but this methodology has proven inadequate as it takes only the watercourse into consideration and ignores critical geomorphology and the associated vegetation and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetland Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model utilizes spatial data readily available from Federal and State agencies and geospatial clearinghouses. An accuracy assessment was performed to assess the impact of varying the 50-year flood height, changing the DEM spatial resolution (1, 3, 5 and 10 m), and positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotone areas. The result of this study is a robust and automated GIS-based model, attached to ESRI ArcMap software, that delineates and classifies variable-width riparian ecotones.
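The core raster operation behind a flood-height-based delineation, flagging cells whose elevation lies within the 50-year flood stage of the adjacent stream, can be sketched outside a GIS. Everything below is an invented toy example (the tiny DEM, the stream elevation, and the flood height are assumptions); the actual model runs inside ArcMap against NHD, SSURGO and NWI data.

```python
import numpy as np

# Invented 4x4 DEM (elevations in metres):
dem = np.array([[12.0, 11.5, 11.0, 13.0],
                [11.8, 10.9, 10.5, 12.5],
                [11.2, 10.4, 10.0, 12.0],
                [12.5, 11.9, 11.3, 13.5]])

stream_elevation = 10.0   # elevation of the nearest stream cell (assumed)
flood_height_50yr = 1.5   # assumed 50-year flood stage above the stream

# Cells inundated by the 50-year flood form the core riparian zone:
riparian_mask = dem <= stream_elevation + flood_height_50yr
n_riparian_cells = int(riparian_mask.sum())
```

In the full model this mask would then be extended with SSURGO hydric-soil and NWI wetland polygons to capture contiguous riparian areas beyond the floodplain itself.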
Abstract:
Simulations of forest stand dynamics in a modelling framework such as the Forest Vegetation Simulator (FVS) are diameter driven, so the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies is presented that critically evaluates the sub-model for change in individual tree basal diameter used in FVS, a dominant forestry model in the Great Lakes region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also affirms that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual and philosophical differences among the individual tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches toward diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and neither is conceptually different from a biological perspective, even though their model forms look different.
No matter what modelling approach is used, the base function is the foundation of an increment model. Two base functions – gamma and Box-Lucas – were identified as candidate base functions for forestry applications. A comparative analysis of empirical fits showed that the quality of fit is essentially similar, and both are sufficiently detailed and flexible for forestry applications. The choice of base function for modelling diameter or basal area increment is a matter of personal preference; however, the gamma base function may be preferred over the Box-Lucas, as it fits periodic increment data in both linear and nonlinear composite model forms. Finally, the utility of site index as a predictor variable has been criticized, as it has been widely used in models for complex, mixed-species forest stands even though it is not well suited for this purpose. An alternative to site index in an increment model was explored, using site index and a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites with data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
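As an illustration of a gamma-type base function of the kind named above, the sketch below uses an assumed functional form and invented coefficients (the dissertation fits such coefficients to inventory data; nothing here is taken from it). The key property is the unimodal shape: increment rises with dbh, peaks, and then declines.

```python
import math

def gamma_base(dbh, b1, b2, b3):
    """Gamma-type base function for diameter increment (form assumed):
        increment = b1 * dbh**b2 * exp(-b3 * dbh)
    Rises for small dbh, peaks at dbh = b2 / b3, then declines."""
    return b1 * dbh ** b2 * math.exp(-b3 * dbh)

# Invented coefficients for demonstration (peak near dbh = 15):
b1, b2, b3 = 0.5, 1.2, 0.08
increments = [gamma_base(d, b1, b2, b3) for d in (5.0, 15.0, 60.0)]
```

In a composite (COMP) formulation, stand and site predictors would multiply or add to this base function; in a potential-modifier (POTMOD) formulation, it would play the role of the potential curve scaled by a modifier.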