712 results for Expanded critical incident approach


Relevance:

30.00%

Publisher:

Abstract:

The dissertation deals with the introduction of complex software systems which, consisting of a combination of parameterized standard software paired with custom software components that secure competitive advantage, no longer constitute software engineering projects in the classical sense, but instead require a strategy-oriented design of business processes and their implementation in software systems. Of particular importance here is the problem of adequately balancing a TCO-optimizing introduction against full support for the company's critical success factors. The use of integrated standard business software, with its potential for TCO reduction but also the risk of losing unique selling points in the market through standardization tendencies, is an essential problem to be solved in implementation projects in order to avoid suboptimal outcomes. The use of process models that are often oriented toward classical software development projects, or that amount to simplified phase models for project management, leads to a lack of situational adequacy in the detailed situations of the subprojects of a complex implementation project. The generic process model developed in this thesis for the strategy-oriented and participatory introduction of complex software systems in the business application domain turns a software implementation project into a strategic element that strengthens the company's competitive position, owing to the specifically elaborated approaches to the strategy-oriented introduction and development of such systems, and to the situation-adequate procedural strategies within the subproject organization.
The considerations discussed in the dissertation favor an approach that closely merges the organizational optimization project with the software implementation process. A key result of the work is a prioritization of business processes that aims, on the one hand, to avoid an organizational suboptimum for processes with high competitive priority and, on the other, to carry out the organizational design and system implementation process quickly and with minimal resources. In addition, excluding further processes from the initial rollout first yields a productive system that covers the company's essential needs but that can be extended in later project steps into a system with comprehensive functionality. This makes it possible to meet the strategic requirements of a modern information system, which must consistently support a company's critical success factors, while at the same time running the project as resource-efficiently as possible by exploiting the cost-reduction potential of a standard solution. A further essential aspect is situation-adequate model instantiation, i.e., the project-specific adaptation of the process model and the situation-adequate choice of procedures in subprojects, thereby exploiting the advantages of the various procedural strategies in concrete project management. The need to develop a project organization for prototyping-oriented approaches is also taken into account. The need for companies to distinguish themselves in the market through strong differentiation potential while pursuing cost optimization under constantly shrinking margins suggests that the developed model will remain successful in the future.
Added to this is the trend toward best-of-breed approaches and component-based systems in software selection, which will make a markedly differentiated project approach even more necessary. The prototyping approaches integrated into the developed model address the need for user integration, which will continue to grow in importance.

Relevance:

30.00%

Publisher:

Abstract:

We use a microscopic theory to describe the dynamics of the valence electrons in divalent-metal clusters. The theory is based on a many-body model Hamiltonian H which takes into account, on the same electronic level, the van der Waals and the covalent bonding. In order to study the ground-state properties of H we have developed an extended slave-boson method. We have studied the bonding character and the degree of electronic delocalization in Hg_n clusters as a function of cluster size. Results show that, for increasing cluster size, an abrupt change occurs in the bond character from van der Waals to covalent bonding at a critical cluster size n_c ~ 10-20. This change also involves a transition from localized to delocalized valence electrons, as a consequence of the competition between both bonding mechanisms.

Relevance:

30.00%

Publisher:

Abstract:

Manufacturing has evolved to become a critical element of the competitive skill set of defense aerospace firms. Given the changes in the acquisition environment and culture, traditional “thrown over the wall” means of developing and manufacturing products are insufficient. Moreover, manufacturing systems are complex systems that need to be carefully designed in a holistic manner, and there are shortcomings in the available tools and methods to assist in the design of these systems. This paper outlines the generation and validation of a framework to guide this manufacturing system design process.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To determine whether interdialytic weight gain differs between patients treated with a dialysate flow (Qd) of 400 mL/min and of 500 mL/min. Design: A randomized, double-blind, crossover intervention study was conducted in patients with chronic kidney disease on hemodialysis to determine differences in interdialytic weight gain between patients treated with a dialysate flow (Qd) of 400 mL/min and 500 mL/min. Patients: Data were analyzed from 46 patients on chronic hemodialysis with a Qd of 400 mL/min and 45 with a Qd of 500 mL/min. Analysis: Hypothesis testing for differences in interdialytic weight gain and the other variables between groups was performed with the paired-samples t test. Pearson's coefficient was calculated for the correlation analysis. Results: There was no significant difference in interdialytic weight gain with a Qd of 400 mL/min vs 500 mL/min (2.37 ± 0.7 vs 2.41 ± 0.6, p = 0.41), nor in Kt/V (1.57 ± 0.25 vs 1.59 ± 0.23, p = 0.45), potassium (4.9 ± 1.1 vs 5.1 ± 1.0, p = 0.45), phosphorus (4.5 ± 1.2 vs 4.4 ± 1.2, p = 0.56), or hemoglobin (11.3 ± 1.8 vs 11.3 ± 1.6, p = 0.96). Conclusions: In patients weighing ≤ 65 kg, the use of a Qd of 400 mL/min is not associated with lower interdialytic weight gain. There is no difference in dialysis efficiency, which suggests that it is a safe intervention in the short term.
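As a minimal sketch of the paired-samples t test used in the analysis above, the snippet below computes the t statistic by hand. The weight-gain values are invented for illustration only; they are not the study's data.

```python
# Paired t statistic: t = mean(d) / sqrt(var(d) / n), d = x - y pairwise.
import math

def paired_t(x, y):
    """Return (t statistic, degrees of freedom) for paired samples x, y."""
    assert len(x) == len(y) and len(x) > 1
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n), n - 1

# Hypothetical interdialytic weight gains (kg) for 5 patients under the
# two dialysate flows (Qd = 400 vs 500 mL/min).
gain_400 = [2.1, 2.5, 2.3, 2.6, 2.2]
gain_500 = [2.2, 2.4, 2.4, 2.5, 2.3]
t, df = paired_t(gain_400, gain_500)
```

The resulting t would normally be compared against the t distribution with df degrees of freedom to obtain a p value; in practice a library routine such as SciPy's `ttest_rel` does this in one call.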

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Delirium is a disturbance of consciousness with acute onset associated with confusion or cognitive dysfunction; it can occur in up to 42% of patients, of whom up to 80% of cases occur in the ICU. Delirium increases hospital stay, duration of mechanical ventilation, and morbidity and mortality. The aim was to assess the period prevalence of delirium in adults admitted to the ICU of a fourth-level hospital during 2012 and the factors associated with its development. Methods: A cross-sectional study with an analytical component was conducted, including patients hospitalized in the medical and surgical ICUs. The CAM-ICU scale and the Mini-Mental State Examination were applied to assess mental status. Significant associations were adjusted by multivariate analysis. Results: 110 patients were included; the mean stay was 5 days, the period prevalence of delirium was 19.9%, and the median age was 64.5 years. A statistically significant association was found between delirium and baseline cognitive impairment, depression, administration of anticholinergics, and sepsis (p < 0.05). Discussion: To date this is the first such study at the institution. The associations between ICU delirium and sepsis, anticholinergic use, and baseline cognitive impairment are consistent and comparable with risk factors described in the world literature.

Relevance:

30.00%

Publisher:

Abstract:

Different procedures to obtain atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment of molecular response approach or the response of molecular fragment approach. The two approaches are nonequivalent; only the latter corresponds in general to a population-difference expression. The Mulliken scheme does not depend on the approach taken but has some computational drawbacks. The different resulting expressions are tested for a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which suggests questioning the role of these indicators in conceptual density-functional theory.
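The population-difference expression mentioned above can be sketched very simply: the condensed Fukui functions are per-atom differences of electron populations between the N- and (N±1)-electron systems. The populations below are invented for illustration and do not come from any particular calculation.

```python
# Condensed Fukui functions from atomic electron populations p_k:
#   f_k(+) = p_k(N+1) - p_k(N)   (response to electron addition)
#   f_k(-) = p_k(N)   - p_k(N-1) (response to electron removal)
# Each set sums to the change in total electron count (1 here).

def condensed_fukui(pop_N, pop_Nplus, pop_Nminus):
    """Return per-atom (f_plus, f_minus) lists from atomic populations."""
    f_plus = [pp - p for pp, p in zip(pop_Nplus, pop_N)]
    f_minus = [p - pm for p, pm in zip(pop_N, pop_Nminus)]
    return f_plus, f_minus

# Hypothetical atom-condensed populations for a 3-atom molecule
pop_N      = [6.2, 1.0, 0.8]   # N-electron system
pop_Nplus  = [6.8, 1.1, 1.1]   # (N+1)-electron system
pop_Nminus = [5.6, 0.9, 0.5]   # (N-1)-electron system
f_plus, f_minus = condensed_fukui(pop_N, pop_Nplus, pop_Nminus)
```

The choice of population scheme (Mulliken, Hirshfeld, ...) is exactly where the seemingly arbitrary choices discussed in the abstract enter.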

Relevance:

30.00%

Publisher:

Abstract:

The computational approach to the Hirshfeld [Theor. Chim. Acta 44, 129 (1977)] atom in a molecule is critically investigated, and several difficulties are highlighted. It is shown that these difficulties are mitigated by an alternative, iterative version of the Hirshfeld partitioning procedure. The iterative scheme ensures that the Hirshfeld definition represents a mathematically proper information entropy, allows the Hirshfeld approach to be used for charged molecules, eliminates arbitrariness in the choice of the promolecule, and increases the magnitudes of the charges. The resulting "Hirshfeld-I charges" correlate well with electrostatic-potential-derived atomic charges.
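The iterative idea can be illustrated with a deliberately simplified 1-D toy: pro-atom weights are rebuilt from the populations of the previous step until self-consistency. Real Hirshfeld-I interpolates free-ion densities as the populations change; the toy below only rescales fixed Gaussian shapes, and all densities and numbers are invented for illustration.

```python
# Toy 1-D "iterative Hirshfeld": partition a model molecular density
# rho(x) between two atoms by iterating the pro-atom weights.
import math

def gauss(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

dx = 0.01
xs = [i * dx - 5.0 for i in range(1201)]          # grid on [-5, 7]
centers, sigmas = [0.0, 2.0], [0.8, 0.5]          # two "atoms"
g = [[gauss(x, c, s) for x in xs] for c, s in zip(centers, sigmas)]

# Model molecular density: 6 electrons on atom 0, 1 electron on atom 1
rho = [6 * g[0][i] + 1 * g[1][i] for i in range(len(xs))]

n = [3.5, 3.5]                                    # deliberately wrong start
for _ in range(300):
    new = [0.0, 0.0]
    for i, r in enumerate(rho):
        w0, w1 = n[0] * g[0][i], n[1] * g[1][i]   # pro-atom densities
        tot = w0 + w1
        if tot > 1e-300:
            new[0] += w0 / tot * r * dx           # N_A = integral w_A * rho
            new[1] += w1 / tot * r * dx
    n = new                                       # feed populations back in
```

Because the pro-atom shapes here exactly match the components of rho, the iteration recovers populations close to (6, 1); the total electron count is conserved at every step.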

Relevance:

30.00%

Publisher:

Abstract:

The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of PC utilisation is compared with QOS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for CAC in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the retention of the network performance guarantees. Several experiments have been carried out and investigated to explain the deviation between the proposed method and the simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay cannot be ignored for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method used in bandwidth allocation. This method gives sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computation cost and a high number of accumulated calculations.
To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped in classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each j-class of traffic (CLRj); an expression for evaluating CLRj is also presented. We can conclude that by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
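The class-grouping idea above can be sketched as follows: sources with identical peak rate and activity probability form a class, each class yields a binomial distribution over its active sources, and the class distributions are convolved to get the aggregate-rate distribution, from which a bufferless CLR estimate follows. This is a simplified illustration of the general approach, not the paper's exact algorithm; the link capacity and traffic classes are invented.

```python
# Bufferless convolution CAC sketch: aggregate-rate distribution by
# class, then CLR estimate = expected excess traffic / offered traffic.
from math import comb

def class_distribution(n_sources, p_active, peak_rate):
    """{aggregate rate: probability} for one class (binomial in the
    number of simultaneously active on/off sources)."""
    return {k * peak_rate: comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
            for k in range(n_sources + 1)}

def convolve(d1, d2):
    """Distribution of the sum of two independent rate distributions."""
    out = {}
    for r1, p1 in d1.items():
        for r2, p2 in d2.items():
            out[r1 + r2] = out.get(r1 + r2, 0.0) + p1 * p2
    return out

def cell_loss_ratio(dist, capacity):
    """Bufferless estimate: lost traffic over offered traffic."""
    offered = sum(r * p for r, p in dist.items())
    lost = sum((r - capacity) * p for r, p in dist.items() if r > capacity)
    return lost / offered

# Two hypothetical traffic classes on a 100 Mbit/s link
classes = [class_distribution(10, 0.3, 10.0),   # 10 sources, peak 10 Mbit/s
           class_distribution(20, 0.2, 5.0)]    # 20 sources, peak 5 Mbit/s
agg = classes[0]
for d in classes[1:]:
    agg = convolve(agg, d)
clr = cell_loss_ratio(agg, 100.0)
```

Grouping identical sources keeps each partial distribution small (n+1 states per class instead of 2^n source combinations), which is the storage saving the abstract refers to.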

Relevance:

30.00%

Publisher:

Abstract:

The thesis which follows, entitled "The Postoccidental Deconstruction and Resignification of 'Modernity': A Critical Analysis", is an exposition and criticism of the critique of occidental modernity found in a group of writings which identify their critique with a "postoccidental" point of view with respect to postcolonial studies. The general problem of the investigation concerns the significance and reach of this critique of modernity in relation to the ongoing debate, in Latin American studies, about the historical relationship between Latin America, as a multicultural / structurally heterogeneous region, and the industrial societies of Europe and North America. A brief Preface explains the genealogy of the author's ideas on this subject. Following this preface, the thesis proceeds to analyze the writings in this corpus through an intertextual, schematic approach which singles out two major elements of the postoccidental critique: "coloniality" and "eurocentrism". These two main elements are investigated in the Introduction and Chapters One and Two, in terms of how they distinguish postoccidental analysis from other theoretical tendencies with which it has affinities but whose key concepts it reformulates in ways that are key to the unique approach which postoccidental analysis takes to modernity, the nature of the capitalist world system, colonialism, subalternization, center/periphery, and development. Chapter Three attempts a critical analysis of the foregoing postoccidentalist deconstruction according to the following question: to what extent does it succeed in deconstructing "modernity" as a term which refers to a historically articulated set of discourses whose underlying purpose has been to justify European and North American hegemony and structural asymmetries vis-à-vis the peripheries of the capitalist world system, based on an ethnocentric, racialist logic of exploitation and subalternization of non-European peoples? A Conclusion follows Chapter Three.

Relevance:

30.00%

Publisher:

Abstract:

Focus on “social determinants of health” provides a welcome alternative to the bio-medical illness paradigm. However, the tendency to concentrate on the influence of “risk factors” related to the living and working conditions of individuals, rather than to examine more broadly the dynamics of the social processes that affect population health, has triggered critical reactions not only from the Global North but especially from voices in the Global South, where there is a long history of addressing questions of health equity. In this article, we elaborate on how focusing instead on the language of “social determination of health” has prompted us to attempt to apply more equity-sensitive approaches to research and related policy and praxis.

Relevance:

30.00%

Publisher:

Abstract:

The degree to which perceived controllability alters the way a stressor is experienced varies greatly among individuals. We used functional magnetic resonance imaging to examine the neural activation associated with individual differences in the impact of perceived controllability on self-reported pain perception. Subjects with greater activation in response to uncontrollable (UC) rather than controllable (C) pain in the pregenual anterior cingulate cortex (pACC), periaqueductal gray (PAG), and posterior insula/SII reported higher levels of pain during the UC versus C conditions. Conversely, subjects with greater activation in the ventral lateral prefrontal cortex (VLPFC) in anticipation of pain in the UC versus C conditions reported less pain in response to UC versus C pain. Activation in the VLPFC was significantly correlated with the acceptance and denial subscales of the COPE inventory [Carver, C. S., Scheier, M. F., & Weintraub, J. K. Assessing coping strategies: A theoretically based approach. Journal of Personality and Social Psychology, 56, 267–283, 1989], supporting the interpretation that this anticipatory activation was associated with an attempt to cope with the emotional impact of uncontrollable pain. A regression model containing the two prefrontal clusters (VLPFC and pACC) predicted 64% of the variance in pain rating difference, with activation in the two additional regions (PAG and insula/SII) predicting almost no additional variance. In addition to supporting the conclusion that the impact of perceived controllability on pain perception varies highly between individuals, these findings suggest that these effects are primarily top-down, driven by processes in regions of the prefrontal cortex previously associated with cognitive modulation of pain and emotion regulation.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Acquiring details of kinetic parameters of enzymes is crucial to biochemical understanding, drug development, and clinical diagnosis in ocular diseases. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted to the more complex kinetics being frequently studied, attention is needed to estimate parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance, involving differentiation of model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes in the glyoxalase pathway (of importance in posttranslational modification of proteins in cataract), and the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimum kinetic experiments iteratively. Most importantly, the distribution of points in the range is critical; it is not simply a matter of even or multiple increases. At least 60% must be below the K_M (or the plural if there is more than one dissociation constant) and 40% above. This choice halves the variance found using a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly improved the variance of kinetic parameter estimation while reducing the number and costs of experiments. Conclusions: We have developed an optimal and iterative method for selecting features of design such as substrate range, number of measurements and choice of intermediate points. Our novel approach minimises parameter error and costs, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
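The 60/40 placement rule stated above can be sketched directly: place about 60% of the substrate concentrations below K_M and 40% above, then measure the Michaelis-Menten rate at each point. The K_M, Vmax and range values below are hypothetical, chosen only to illustrate the rule.

```python
# Michaelis-Menten initial rate: v = Vmax * S / (K_M + S)
def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

def design_points(km, n_total, s_max, frac_below=0.6):
    """Substrate concentrations with ~frac_below of points below K_M."""
    n_below = round(frac_below * n_total)
    n_above = n_total - n_below
    below = [km * (i + 1) / (n_below + 1) for i in range(n_below)]
    above = [km + (s_max - km) * (i + 1) / n_above for i in range(n_above)]
    return below + above

km, vmax = 2.0, 10.0                       # hypothetical parameters
points = design_points(km, n_total=10, s_max=10.0)
rates = [mm_rate(s, vmax, km) for s in points]
```

Concentrating points below K_M puts the measurements where the rate is most sensitive to K_M, which is why the abstract reports a halved variance compared with an even spread.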

Relevance:

30.00%

Publisher:

Abstract:

In areas such as drug development, clinical diagnosis and biotechnology research, acquiring details about the kinetic parameters of enzymes is crucial. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted to the more complex kinetics being frequently studied, attention is needed to estimate parameters of such models with low variance. We demonstrate that a Bayesian approach (the use of prior knowledge) can produce major gains, quantifiable in terms of information, productivity and accuracy of each experiment. Developing the use of Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has enabled the identification of trends between kinetic model types, sets of design rules and the key conclusion that such designs should be based on some prior knowledge of K_M and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, number of measurements and choice of intermediate points. The final design collects data suitable for accurate modelling and analysis and minimises the error in the parameters estimated. (C) 2003 Elsevier Science B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of scattering of time-harmonic acoustic waves by an unbounded sound-soft surface which is assumed to lie within a finite distance of some plane. The paper is concerned with the study of an equivalent variational formulation of this problem set in a scale of weighted Sobolev spaces. We prove well-posedness of this variational formulation in an energy space with weights which extends previous results in the unweighted setting [S. Chandler-Wilde and P. Monk, SIAM J. Math. Anal., 37 (2005), pp. 598–618] to more general inhomogeneous terms in the Helmholtz equation. In particular, in the two-dimensional case, our approach covers the problem of plane wave incidence, whereas in the three-dimensional case, incident spherical and cylindrical waves can be treated. As a further application of our results, we analyze a finite section type approximation, whereby the variational problem posed on an infinite layer is approximated by a variational problem on a bounded region.
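A schematic statement of the problem class described above may help fix ideas. The notation here is generic (symbols, spaces and the operator T are illustrative and may differ from the paper's):

```latex
% Sound-soft scattering: find u satisfying the Helmholtz equation with
% wavenumber k > 0 in the region D above the rough surface \Gamma,
\Delta u + k^2 u = g \quad \text{in } D, \qquad u = 0 \quad \text{on } \Gamma,
% reduced, in the usual way, to the layer S_H between \Gamma and an
% artificial plane x_d = H carrying a Dirichlet-to-Neumann operator T:
\int_{S_H} \bigl( \nabla u \cdot \overline{\nabla v} - k^2 u\,\overline{v} \bigr)\,dx
\;-\; \int_{x_d = H} \overline{v}\, T u \,\mathrm{d}s
\;=\; -\int_{S_H} g\,\overline{v}\,dx
\qquad \text{for all } v \in V .
```

The weighted-space results in the abstract correspond to allowing the inhomogeneous term g, and hence the test/trial space V, to carry weights rather than being restricted to the unweighted setting of the earlier work cited.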