910 results for Process models
Abstract:
In recent history, there has been a trend of increasing partisan polarization throughout most of the American political system. Some of the impacts of this polarization are obvious; however, there is reason to believe that we miss some of the indirect effects of polarization. Accompanying the trend of increased polarization has been an increase in the contentiousness of the Supreme Court confirmation process. I believe that these two trends are related. Furthermore, I argue that these trends have an impact on judicial behavior. This is an issue worth exploring, since the Supreme Court is the most isolated branch of the federal government. The Constitution structured the Supreme Court to ensure that it was as isolated as possible from short-term political pressures and interests. This study attempts to show how it may be possible that those goals are no longer being fully achieved. My first hypothesis in this study is that increases in partisan polarization are a direct cause of the increase in the level of contention during the confirmation process. I then hypothesize that the more contention a justice faces during his or her confirmation process, the more ideologically extreme that justice will then vote on the bench. This means that a nominee appointed by a Republican president will tend to vote even more conservatively than was anticipated following a contentious confirmation process, and vice versa for Democratic appointees. In order to test these hypotheses, I developed a data set for every Supreme Court nominee dating back to President Franklin D. Roosevelt's appointments (1937). With this data set, I ran a series of regression models to analyze these relationships. Statistically speaking, the results support my first hypothesis in a fairly robust manner. My regression results for my second hypothesis indicate that the trend I am looking for is present for Republican nominees. For Democratic nominees, the impacts are less robust.
Nonetheless, as the results will show, contention during the confirmation process does seem to have some impact on judicial behavior. Following my quantitative analysis, I analyze a series of case studies. These case studies serve to provide tangible examples of these statistical trends as well as to explore what else may be going on during the confirmation process and subsequent judicial decision-making. I use Justices Stevens, Rehnquist, and Alito as the subjects for these case studies. These cases will show that the trends described above do seem to be identifiable at the level of an individual case. These studies further help to indicate other potential impacts on judicial behavior. For example, following Justice Rehnquist's move from Associate to Chief Justice, we see a marked change in his behavior. Overall, this study serves as a means of analyzing some of the more indirect impacts of partisan polarization in modern politics. Further, the study offers a means of exploring some of the possible constraints (both conscious and subconscious) that Supreme Court justices may feel while they decide how to cast a vote in a particular case. Given the wide-reaching implications of Supreme Court decisions, it is important to try to grasp a full view of how these decisions are made.
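The regression analysis described in this abstract can be sketched roughly as follows. This is an illustrative stand-in only: the variable names, the synthetic data, and the effect size are invented and are not the study's actual data set or model specification.

```python
import numpy as np

# Hypothetical illustration: regress a justice's ideological "drift"
# (observed voting score minus the score expected at nomination) on a
# contention score for the confirmation process. All numbers here are
# synthetic; the study's real data covers nominees from 1937 onward.
rng = np.random.default_rng(0)
n = 40
contention = rng.uniform(0, 1, n)              # e.g. share of "no" votes
drift = 0.5 * contention + rng.normal(0, 0.1, n)

# Ordinary least squares: solve min ||X b - y|| for b = (intercept, slope)
X = np.column_stack([np.ones(n), contention])
beta, *_ = np.linalg.lstsq(X, drift, rcond=None)
print(f"intercept={beta[0]:.3f}, slope={beta[1]:.3f}")
```

A positive estimated slope would correspond to the hypothesized pattern: more contention, more ideological drift in the appointing party's direction.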
Abstract:
In Western societies the increase in female employment (especially among married women) is seen as having brought about the crisis of the traditional model of the family, reinforcing the position of the "modern" model - the egalitarian family with two working spouses and a "dual-career" family. In contrast, the transitional situation in the post-communist countries during the 1990s is producing a crisis of the family with two working spouses (the basic type of the communist period) and leading to new power relations within the family. While the growth of dual-earner households in this century has implied modification of family models towards greater symmetry of responsibility for breadwinning and homemaking, there is considerable evidence that women's increased employment does not necessarily lead to a more egalitarian approach to gender roles within the family. The group set out to investigate the economic situation of families and economic power within the family as a crucial factor in the transformation of families with two working spouses, in order to reveal the specific patterns of gender contracts and power relations within the family that are emerging in response to the current political and economic transformation. They opted for a comparative approach, selecting the Czech Republic as a country where the very similar tendencies of a few years ago (almost 100% of women employed and the family as a realm of considerable private freedom where both women's and men's gender identities and the traditional distribution of family responsibilities were largely preserved) are combined with an experience of economic inequalities during the 1990s very different from Russia's. In the first stage of the study they surveyed 300 married couples (150 in each country) on the question of breadwinning.
They then carried out in-depth interviews with 10 couples from each country (selected from among the educated layers of the population), focusing on the process of the social construction of gender, using breadwinning and homemaking as gender boundaries which distinguish men from women. By analysing changes in social position and the type of interpersonal interaction of spouses they distinguished two main types of family contracts: the neo-traditional "communal sharing" (with male breadwinner, traditional distribution of family chores and negotiated family power) and the modern one based on negotiated agreement. The most important precondition of husband-wife agreement about breadwinning seemed to be their overall gender ideology rather than economic and/or family circumstances. In general, wives were more likely to express egalitarian views, supporting the blurring or even elimination of many gender boundaries. Husbands, on the other hand, more often gave responses calling for the continued maintenance of gender boundaries. The analysis showed that breadwinning is still an important gender boundary in these cultures, one that is assumed unless it is explicitly questioned and that is seen as part of what makes a man a "real man". The majority of respondents seemed to be committed to egalitarian ideology on gender roles and the distribution of family tasks, including decision making, but this is contradicted by the persistent idea of the husband as the breadwinner. This contradiction is more characteristic of the Russian situation than of the Czech. The quantitative study showed a difference in prevailing family models between the two countries, with a clearer shift towards the traditional family contract in the Russian case. The Czechs were more likely to consider their partnerships as based on negotiated agreement, while the Russians saw theirs as based on egalitarian contract, in both cases seeing this as the norm.
The majority of couples said they felt satisfied with their marriage, although in both countries wives seemed to be less satisfied. There was however a difference in the issues that aroused dissatisfaction, with Czech women being more sensitive to issues such as self-realisation, personal independence, understanding and recognition in the family, and Russians to issues of love, understanding and recognition. The most disputed area for the majority of families was chores in the home, presumably because in many families both husband and wife were working hard outside the home and because a number of partners had differing views as to the ideal distribution of chores within the family. The distribution of power in the family seems to be linked to the level of well-being. The analysis showed that in the dominant democratic model there is still an inverse connection between family leadership and well-being: the more prominent the wife's position as head of the family is, the lower the level of family income. This may reflect both the husband's refusal to play the leading role in the family and his rejection of any involvement in family issues in such a family. The qualitative research revealed that both men and women see the breadwinning role to be an essential part of masculine identity, a role which the female partner would take on temporarily to assist the male but not permanently since this would threaten the gender boundaries and the man's identity. At the same time, few breadwinners expressed a sense of job satisfaction and all considered their choice as imposed on them by the circumstances (i.e. having a family in difficult times). The group feel that family orientation and some loss of personal involvement in their profession are partly reflected in the fact that many of the men felt more comfortable and self-confident at home than at work. Women's work, on the other hand, was largely seen as a source of personal self-realisation and social life.
Eight out of ten of the Russian women interviewed were employed, although only two on a full-time basis, but none saw their jobs as adding substantially to the family budget. Both partners see the most important factor as the wife's wish to work or stay at home, and do not think it wise for the wife to work at the expense of her part of the "family contract", although husbands from the "egalitarian" relationships expressed more willingness to compromise. The analysis showed clearly that wives and husbands did not construct gender boundaries in isolation, with the interviews providing clear evidence of negotiation. At the same time, husbands' interpretations of their wives' employment were less susceptible to the influence of negotiation than were their gender attitudes and norms about breadwinning. One of the most interesting aspects of the spouses' negotiations was the extent to which they disagreed about what they seemed to have agreed upon. Most disagreements about the breadwinning boundaries, however, were over norms and were settled by changes in norms rather than in behavioural interpretation. Changes in norms were often a form of peace offering or were in response to changes in circumstances. The study did show, however, that many of the efforts at cooperation and compensation were more symbolic than real, and the group found the plasticity of expressed gender ideology to be one of the most striking findings of their work. They conclude that the shift towards more traditional gender distributions of incomes and domestic chores does not automatically mean the reestablishment of a patriarchal model of family power. On the contrary, it seems to be a compromise formation, relatively unstable, temporary and containing self-defeating forces, as the split between the personal and professional value of work and its social value expressed in a money equivalent cannot be maintained for generations.
Abstract:
Neurons generate spikes reliably with millisecond precision if driven by a fluctuating current--is it then possible to predict the spike timing knowing the input? We determined parameters of an adapting threshold model using data recorded in vitro from 24 layer 5 pyramidal neurons from rat somatosensory cortex, stimulated intracellularly by a fluctuating current simulating synaptic bombardment in vivo. The model generates output spikes whenever the membrane voltage (a filtered version of the input current) reaches a dynamic threshold. We find that for input currents with large fluctuation amplitude, up to 75% of the spike times can be predicted with a precision of +/-2 ms. Some of the intrinsic neuronal unreliability can be accounted for by a noisy threshold mechanism. Our results suggest that, under random current injection into the soma, (i) neuronal behavior in the subthreshold regime can be well approximated by a simple linear filter; and (ii) most of the nonlinearities are captured by a simple threshold process.
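The adapting-threshold mechanism described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' fitted model: the membrane voltage is a simple leaky (linear) filter of the input current, and a spike is emitted whenever it crosses a threshold that jumps up at each spike and relaxes back. All parameter values and the input statistics are invented.

```python
import numpy as np

# Minimal adapting-threshold sketch (invented parameters, not the
# fitted model from the paper).
rng = np.random.default_rng(1)
dt = 0.1                                   # time step (ms)
t_steps = 5000                             # 500 ms of simulated input
current = rng.normal(1.2, 1.0, t_steps)    # fluctuating input current

tau_m, tau_th = 10.0, 30.0                 # membrane / threshold time constants (ms)
theta0, jump = 1.0, 0.5                    # resting threshold and per-spike jump
v, theta = 0.0, theta0
spikes = []
for i in range(t_steps):
    v += dt / tau_m * (-v + current[i])    # leaky integration = linear filter
    theta += dt / tau_th * (theta0 - theta)  # threshold decays to resting value
    if v >= theta:                         # dynamic-threshold spike condition
        spikes.append(i * dt)
        theta += jump                      # adaptation: threshold jumps up
        v = 0.0                            # voltage reset
print(f"{len(spikes)} spikes in {t_steps * dt:.0f} ms")
```

Larger input fluctuations make the voltage trajectory cross the threshold at sharply defined times, which is the regime in which the paper reports the highest spike-time predictability.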
Abstract:
Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently about the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992), may be a useful "test" example for comparison of various procedures.
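The additive hazard model with a Poisson marker process mentioned above can be illustrated with a small simulation: the hazard at time t is a baseline plus a term proportional to the current marker count, h(t) = h0 + b * N(t). The parameter values and the discrete-time approximation below are invented for illustration and are not the authors' analysis.

```python
import numpy as np

# Additive hazard with a Poisson marker: h(t) = h0 + b * N(t),
# where N(t) counts marker events up to time t. Invented parameters.
rng = np.random.default_rng(2)
h0, b, marker_rate = 0.01, 0.02, 0.1   # per unit time
dt, t_max = 0.1, 500.0

def simulate_failure_time():
    """Discrete-time approximation: in each small step the marker jumps
    with prob marker_rate*dt, and failure occurs with prob h(t)*dt."""
    n_marker, t = 0, 0.0
    while t < t_max:
        if rng.random() < marker_rate * dt:
            n_marker += 1                      # marker event raises the hazard
        if rng.random() < (h0 + b * n_marker) * dt:
            return t                           # failure
        t += dt
    return t_max                               # administratively censored

times = [simulate_failure_time() for _ in range(200)]
print(f"median failure time ~ {np.median(times):.1f}")
```

Because the marker trajectory carries information about the current hazard, observing N(t) sharpens prediction of the remaining time to failure, which is the use case the review discusses.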
Abstract:
We consider nonparametric missing data models for which the censoring mechanism satisfies coarsening at random and which allow complete observations on the variable X of interest. We show that, beyond some empirical process conditions, the only essential condition for efficiency of an NPMLE of the distribution of X is that the regions associated with incomplete observations on X contain enough complete observations. This is heuristically explained by describing the EM-algorithm. We establish identifiability of the self-consistency equation and efficiency of the NPMLE in order to make this statement rigorous. The usual differentiability conditions in the proof are avoided by using an identity which holds for the NPMLE of linear parameters in convex models. We provide a bivariate censoring application in which the condition, and hence the NPMLE, fails, but where other estimators, not based on the NPMLE principle, are highly inefficient. It is shown how to slightly reduce the data so that the conditions hold for the reduced data. The conditions are verified for the univariate censoring, double censored, and Ibragimov-Has'minski models.
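For the simplest case mentioned above, univariate right censoring, the self-consistency idea can be sketched concretely: the NPMLE (the Kaplan-Meier estimator) is the fixed point of an EM-style iteration that redistributes the mass of each censored observation over the complete observations to its right. The data below are invented for illustration.

```python
import numpy as np

# Self-consistency iteration for right-censored data (invented data).
times = np.array([2.0, 3.0, 3.0, 5.0, 7.0, 8.0])
event = np.array([1,   0,   1,   1,   0,   1])    # 1 = observed failure

support = np.unique(times[event == 1])            # candidate mass points
p = np.full(len(support), 1.0 / len(support))     # initial uniform mass

for _ in range(200):                              # EM iterations
    new_p = np.zeros_like(p)
    for t, d in zip(times, event):
        if d:                                     # complete observation:
            new_p[support == t] += 1.0            #   all its mass at t
        else:                                     # censored observation:
            right = support > t                   #   spread its mass over
            if right.any():                       #   support points > t,
                new_p[right] += p[right] / p[right].sum()  # proportionally
    p = new_p / new_p.sum()
print(dict(zip(support, np.round(p, 3))))
```

At the fixed point the masses agree with the Kaplan-Meier jumps; when a censored region contains no complete observations to absorb its mass, the iteration (and the NPMLE) breaks down, which is the failure mode the bivariate censoring application exhibits.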
Abstract:
BACKGROUND: Cyclic recruitment during mechanical ventilation contributes to ventilator associated lung injury. Two different pathomechanisms in acute respiratory distress syndrome (ARDS) are currently discussed: alveolar collapse vs persistent flooding of small airways and alveoli. We compare two different ARDS animal models by computed tomography (CT) to describe different recruitment and derecruitment mechanisms at different airway pressures: (i) lavage-ARDS, favouring alveolar collapse by surfactant depletion; and (ii) oleic acid ARDS, favouring alveolar flooding by capillary leakage. METHODS: In 12 pigs [25 (1) kg], ARDS was randomly induced, either by saline lung lavage or oleic acid (OA) injection, and 3 animals served as controls. A respiratory breathhold manoeuvre without spontaneous breathing at different continuous positive airway pressure (CPAP) was applied in random order (CPAP levels of 5, 10, 15, 30, 35 and 50 cm H(2)O) and spiral-CT scans of the total lung were acquired at each CPAP level (slice thickness=1 mm). In each spiral-CT the volume of total lung parenchyma, tissue, gas, non-aerated, well-aerated, poorly aerated, and over-aerated lung was calculated. RESULTS: In both ARDS models non-aerated lung volume decreased significantly from CPAP 5 to CPAP 50 [oleic acid lung injury (OAI): 346.9 (80.1) to 96.4 (48.8) ml, P<0.001; lavage-ARDS: 245 (17.6) to 42.7 (4.8) ml, P<0.001]. In lavage-ARDS poorly aerated lung volume decreased at higher CPAP levels [232 (45.2) at CPAP 10 to 84 (19.4) ml at CPAP 50, P<0.001] whereas in OAI poorly aerated lung volume did not vary at different airway pressures. CONCLUSIONS: In both ARDS models well-aerated and non-aerated lung volume respond to different CPAP levels in a comparable fashion; thus, cyclical alveolar collapse seems to be part of the derecruitment process in OA-ARDS as well.
In OA-ARDS, the increase in poorly aerated lung volume reflects the specific initial lesion, that is, capillary leakage with interstitial and alveolar oedema.
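The aeration compartments reported in this abstract are conventionally derived by thresholding CT numbers (Hounsfield units) voxel by voxel. The cutoffs below are the commonly used ones in the quantitative-CT literature (over-aerated -1000 to -900 HU, well aerated -900 to -500 HU, poorly aerated -500 to -100 HU, non-aerated -100 to +100 HU); the study's exact thresholds are not stated in the abstract, so treat these values as an assumption.

```python
import numpy as np

# Classify lung voxels into aeration compartments by HU value, using
# the conventional cutoffs (assumed, not taken from this study).
def classify_aeration(hu):
    """Map an array of HU values to aeration category labels."""
    hu = np.asarray(hu)
    bins = [-900, -500, -100, 100]
    labels = ["over-aerated", "well aerated", "poorly aerated", "non-aerated"]
    idx = np.digitize(hu, bins)          # 0..4; clamp >=100 HU into last bin
    return [labels[min(i, 3)] for i in idx]

print(classify_aeration([-950, -700, -300, 20]))
```

Summing voxel volumes within each category over a whole spiral-CT scan yields the compartment volumes (in ml) compared across CPAP levels in the results.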
Abstract:
Monte Carlo (code GEANT) produced 6 and 15 MV phase space (PS) data were used to define several simple photon beam models. For creating the PS data the energy of starting electrons hitting the target was tuned to get correct depth dose data compared to measurements. The modeling process used the full PS information within the geometrical boundaries of the beam including all scattered radiation of the accelerator head. Scattered radiation outside the boundaries was neglected. Photons and electrons were assumed to be radiated from point sources. Four different models were investigated which involved different ways to determine the energies and locations of beam particles in the output plane. Depth dose curves, profiles, and relative output factors were calculated with these models for six field sizes from 5x5 to 40x40 cm2 and compared to measurements. Model 1 uses a photon energy spectrum independent of location in the PS plane and a constant photon fluence in this plane. Model 2 takes into account the spatial particle fluence distribution in the PS plane. A constant fluence is used again in model 3, but the photon energy spectrum depends upon the off axis position. Model 4, finally, uses the spatial particle fluence distribution and off axis dependent photon energy spectra in the PS plane. Depth dose curves and profiles for field sizes up to 10x10 cm2 were not model sensitive. Good agreement between measured and calculated depth dose curves and profiles for all field sizes was reached for model 4. However, increasing deviations were found for increasing field sizes for models 1-3. Large deviations resulted for the profiles of models 2 and 3. This is due to the fact that these models overestimate or underestimate the energy fluence at large off axis distances. Relative output factors consistent with measurements resulted only for model 4.
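The structure of "model 4" can be sketched as a sampling routine: each photon starts from a point source, its position in the output plane is drawn from the spatial fluence distribution, and its energy is drawn from a spectrum that depends on the off-axis distance. The distributions and numbers below are invented stand-ins for the phase-space-derived data, included only to make the model structure concrete.

```python
import numpy as np

# Sketch of a model-4-style photon source (invented distributions).
rng = np.random.default_rng(3)

def sample_photon():
    # Position: uniform over a disc as a placeholder for the measured
    # spatial fluence distribution in the PS plane.
    r = 20.0 * np.sqrt(rng.random())        # radius in cm
    phi = 2 * np.pi * rng.random()
    # Energy: mean energy decreasing off axis (beam softening), the
    # qualitative feature model 4 captures via off-axis spectra.
    mean_e = 2.0 - 0.02 * r                 # MeV, invented parametrization
    energy = rng.exponential(mean_e)        # placeholder spectrum shape
    return r * np.cos(phi), r * np.sin(phi), energy

photons = [sample_photon() for _ in range(1000)]
energies = np.array([p[2] for p in photons])
print(f"mean sampled energy: {energies.mean():.2f} MeV")
```

Models 1-3 correspond to dropping one or both dependencies (constant fluence, or a single location-independent spectrum), which is why they misestimate the energy fluence far off axis.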
Abstract:
This review deals with an important aspect of organ transplantation, namely the process of psychic organ integration and organ-related fantasies. The body schema and body self are two important concepts in the integration of a transplanted organ. Different models and theories on organ integration are presented and discussed. There is evidence that, besides the emotional impact and the influence on well-being, organ integration depends closely on the psychic processes involved in the incorporation of the transplanted organ and the respective organ-related fantasies. Therefore, these organ fantasies - whether unconscious or conscious - may play an important role in the future development of the instinctive and highly individual relation the patients elaborate with the new organ. Besides the concern with the new organ, grief for the lost old and sick organ may also influence the patient's thoughts. Moreover, successfully resolving all these issues fosters the "good practice" patients develop towards the new situation. This bears on issues such as compliance, infections, rejection episodes and, most importantly, organ survival.
Abstract:
Polycarbonate (PC) is an important engineering thermoplastic that is currently produced on a large industrial scale using bisphenol A and monomers such as phosgene. Since phosgene is highly toxic, a non-phosgene approach using diphenyl carbonate (DPC) as an alternative monomer, as developed by Asahi Corporation of Japan, is a significantly more environmentally friendly alternative. Other advantages include the use of CO2 instead of CO as raw material and the elimination of major waste water production. However, for the production of DPC to be economically viable, reactive-distillation units are needed to obtain the necessary yields by shifting the reaction equilibrium towards the desired products and separating the products at the point where the equilibrium reaction occurs. In the field of chemical reaction engineering, many reactions suffer from low equilibrium constants. The main goal of this research is to determine the optimal process needed to shift the reactions by using appropriate control strategies for the reactive distillation system. An extensive dynamic mathematical model has been developed to help us investigate different control and processing strategies of the reactive distillation units to increase the production of DPC. The high-fidelity dynamic models include extensive thermodynamic and reaction-kinetics models while incorporating the necessary mass and energy balances of the various stages of the reactive distillation units. The study presented in this document shows the possibility of producing DPC via one reactive distillation column instead of the conventional two, with a production rate of 16.75 tons/h corresponding to starting reactant feeds of 74.69 tons/h of phenol and 35.75 tons/h of dimethyl carbonate. This represents a threefold increase over the projected production rate given in the literature based on a two-column configuration.
In addition, the purity of the DPC produced could reach levels as high as 99.5% with the effective use of controls. These studies are based on simulations performed using high-fidelity dynamic models.
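The reported flows can be checked with back-of-envelope stoichiometry. This is not a calculation from the thesis: it assumes the overall route 2 PhOH + DMC -> DPC + 2 MeOH and standard molar masses, and simply converts the stated mass flows to molar flows.

```python
# Back-of-envelope check of the reported flows (assumed stoichiometry:
# 2 PhOH + DMC -> DPC + 2 MeOH; standard molar masses in g/mol).
M_PHENOL, M_DMC, M_DPC = 94.11, 90.08, 214.22

phenol_kmol = 74.69e3 / M_PHENOL     # ~793.6 kmol/h
dmc_kmol = 35.75e3 / M_DMC           # ~396.9 kmol/h
dpc_kmol = 16.75e3 / M_DPC           # ~78.2 kmol/h

# On a 2:1 phenol:DMC basis the feeds are almost exactly stoichiometric.
conversion = dpc_kmol / min(phenol_kmol / 2, dmc_kmol)
print(f"overall DPC conversion ~ {conversion:.1%}")
```

The result, roughly 20% conversion of the limiting feed, illustrates why equilibrium-shifting reactive distillation (rather than a once-through reactor) is central to the economics discussed above.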
Abstract:
Virtual machines emulating hardware devices are generally implemented in low-level languages and in a low-level style for performance reasons. This trend results in systems that are largely difficult to understand, difficult to extend, and hard to maintain. As new general techniques for virtual machines arise, it gets harder to incorporate or test these techniques because of early design and optimization decisions. In this paper we show how such decisions can be postponed to later phases by separating virtual machine implementation issues from the high-level machine-specific model. We construct compact models of whole-system VMs in a high-level language, which exclude all low-level implementation details. We use the pluggable translation toolchain PyPy to translate those models to executables. During the translation process, the toolchain reintroduces the VM implementation and optimization details for specific target platforms. As a case study we implement an executable model of a hardware gaming device. We show that our approach to VM building increases understandability, maintainability and extendability while preserving performance.
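The idea of a compact, high-level model of a hardware device, free of implementation details, can be illustrated with a toy example. This sketch is not from the paper: it models a minimal CPU with a two-instruction set in plain Python; a toolchain like PyPy could then translate such a model into an efficient low-level executable.

```python
# Toy high-level model of a hardware device (invented, for illustration):
# a minimal accumulator CPU with a two-instruction set. No low-level
# implementation details (dispatch tables, memory layout) appear here.
class ToyCPU:
    def __init__(self):
        self.acc = 0          # accumulator register
        self.pc = 0           # program counter

    def run(self, program):
        """Execute a program given as a list of (opcode, operand) pairs."""
        while self.pc < len(program):
            op, arg = program[self.pc]
            if op == "LOAD":
                self.acc = arg
            elif op == "ADD":
                self.acc += arg
            self.pc += 1
        return self.acc

print(ToyCPU().run([("LOAD", 2), ("ADD", 3)]))   # 5
```

The point of the paper's approach is that such a model stays readable and testable, while performance-oriented decisions are reintroduced automatically during translation rather than baked into the source.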
Abstract:
Systems must co-evolve with their context. Reverse engineering tools are a great help in this process of required adaptation. In order for these tools to be flexible, they work with models, abstract representations of the source code. The extraction of such information from source code can be done using a parser. However, it is fairly tedious to build new parsers, and this is made worse by the fact that it has to be done over and over again for every language we want to analyze. In this paper we propose a novel approach which minimizes the knowledge required of a given language for the extraction of models implemented in that language, by reflecting on the implementation of preparsed ASTs provided by an IDE. In a second phase we use a technique referred to as Model Mapping by Example to map platform-dependent models onto domain-specific models.
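The core idea of extracting a model from an already-parsed AST, instead of writing a dedicated parser, can be sketched briefly. Here Python's own `ast` module stands in for the preparsed ASTs an IDE would provide (the paper works with IDE-provided ASTs, not this module), and the extracted "model" is just a dict of classes and their methods.

```python
import ast

# Extract a simple source-code model by walking a preparsed AST.
# Python's `ast` module stands in for an IDE's preparsed ASTs.
source = """
class Account:
    def deposit(self, amount): ...
    def withdraw(self, amount): ...
"""

def extract_model(code):
    """Return {class_name: [method_names]} from parsed source."""
    tree = ast.parse(code)
    model = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n.name for n in node.body
                       if isinstance(n, ast.FunctionDef)]
            model[node.name] = methods
    return model

print(extract_model(source))   # {'Account': ['deposit', 'withdraw']}
```

No grammar or parser had to be written: only knowledge of the AST node kinds is needed, which is exactly the reduction in per-language effort the paper targets.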
Abstract:
In the laboratory of Dr. Dieter Jaeger at Emory University, we use computer simulations to study how the biophysical properties of neurons—including their three-dimensional structure, passive membrane resistance and capacitance, and active membrane conductances generated by ion channels—affect the way that the neurons transfer synaptic inputs into the action potential streams that represent their output. Because our ultimate goal is to understand how neurons process and relay information in a living animal, we try to make our computer simulations as realistic as possible. As such, the computer models reflect the detailed morphology and all of the ion channels known to exist in the particular neuron types being simulated, and the model neurons are tested with synaptic input patterns that are intended to approximate the inputs that real neurons receive in vivo. The purpose of this workshop tutorial was to explain what we mean by ‘in vivo-like’ synaptic input patterns, and how we introduce these input patterns into our computer simulations using the freely available GENESIS software package (http://www.genesis-sim.org/GENESIS). The presentation was divided into four sections: first, an explanation of what we are talking about when we refer to in vivo-like synaptic input patterns
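One common way to build "in vivo-like" synaptic input of the kind described above is to draw presynaptic spike times from Poisson processes and convolve them with a synaptic conductance kernel. The sketch below is a hedged stand-in: the rates and kernel shape are invented, and the tutorial itself works in GENESIS rather than Python.

```python
import numpy as np

# Generate a Poisson input spike train and its synaptic conductance
# trace (invented rates and kernel; illustrative only).
rng = np.random.default_rng(4)
rate_hz, t_max, dt = 20.0, 1.0, 1e-4     # 20 Hz input over 1 s

# Poisson spike train: in each small bin, spike with prob rate*dt
spike_train = rng.random(int(t_max / dt)) < rate_hz * dt

# Alpha-function synaptic kernel g(t) = (t/tau) * exp(1 - t/tau),
# normalized to peak 1 at t = tau
tau = 2e-3
t_kernel = np.arange(0, 10 * tau, dt)
kernel = (t_kernel / tau) * np.exp(1 - t_kernel / tau)

# Conductance = spike train convolved with the kernel
conductance = np.convolve(spike_train.astype(float), kernel)[:len(spike_train)]
print(f"{spike_train.sum()} input spikes, peak g = {conductance.max():.2f}")
```

In a detailed compartmental model, hundreds of such independent excitatory and inhibitory trains would be attached across the dendritic morphology to approximate the synaptic bombardment a neuron receives in vivo.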
Abstract:
The economic and social changes taking place in Russia in recent decades have implied a restructuring of the Russian society. Among other things, Russian leaders have expressed a need for the reorientation of social development. In the 1990s, cooperation was initiated on a number of social work and social welfare projects with international support, a process further speeded up during President Yeltsin's state visit to Sweden in 1997. Discussions between the Swedish International Development Cooperation Agency (Sida) and the Russian authorities dealing with welfare issues started from the assumption that Russian professional social work was weak and needed to be strengthened. In the 1990s Sida was also given a stronger general mandate to work with other former Soviet countries in Eastern Europe, for example the Baltic States. The Russian-Swedish discussions resulted in projects aiming to raise social work competencies in public authorities, managements and among social workers in Russia. One of the areas chosen for these projects was Saint Petersburg, where several projects aiming to develop new models for social work were launched. The point of departure has been to transfer and adjust Swedish models of social work to the Russian context. The Stockholm University Department of Social Work became responsible for a number of such projects and, besides using academic teachers, also involved a number of practitioners, such as social workers in disablement services and reformatory staff, who could meet and match Russian authorities and partners.
Abstract:
Simulation techniques are almost indispensable in the analysis of complex systems. Material flows and related information-flow processes in logistics often possess such complexity. Further problems arise as the processes change over time and also pose a Big Data challenge. To cope with these issues, adaptive simulations are more and more frequently used. This paper presents a few relevant advanced simulation models and introduces a novel model structure, which unifies the modelling of geometrical relations and time processes. This way the process structure and its geometric relations can be handled in a well understandable and transparent way. The capabilities and applicability of the model are also demonstrated via an example.
Abstract:
ABSTRACT ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT SEPTEMBER 2007 NEELIMA KANURI, B.S., BIRLA INSTITUTE OF TECHNOLOGY AND SCIENCES PILANI INDIA M.S., UNIVERSITY OF MASSACHUSETTS AMHERST Directed by: Professor Ian Grosse Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside of iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational costs, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent KB tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and the shell element models during an optimization process to select the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is the automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second method is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained control of knowledge viewing and sharing in an engineering workflow integration environment, such as iSIGHT-FD. Together these methods help to reduce the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
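The model-switching behavior described above can be sketched as a simple rule. This is a heavily simplified stand-in: the thesis encodes such knowledge in an ontology and reasons over it, whereas the function below hard-codes one invented criterion (switch to the high-fidelity model near a constraint boundary).

```python
# Invented illustration of fidelity switching during optimization:
# use the cheap beam-element model far from the stress constraint,
# and the expensive shell-element model near it. The criterion and
# the tolerance value are assumptions, not the thesis's actual rules.
def select_analysis_model(stress_ratio, tolerance=0.1):
    """stress_ratio = predicted stress / allowable stress from the
    low-fidelity model; switch to shell near the constraint boundary."""
    if abs(1.0 - stress_ratio) <= tolerance:
        return "shell-element"      # high fidelity, high cost
    return "beam-element"           # low fidelity, cheap

print(select_analysis_model(0.50))  # far from the boundary
print(select_analysis_model(0.95))  # near the boundary
```

In the thesis, the advantage of putting such rules in an ontological knowledge base rather than in code is that the same knowledge can also drive report generation and access control.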