820 results for approach to information systems
Abstract:
Most statistical analysis, in both theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful, as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they frequently fail to reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application to a small-sample, repeated-measures, normally distributed growth-curve data set is presented.
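The abstract does not spell out the model or the sampler, so the following is only a minimal illustrative sketch: a Gibbs sampler for a hypothetical normal random-intercept (growth-type) model with conjugate updates, written in Python/NumPy. The priors, data dimensions and variable names are assumptions made for illustration, not the dissertation's actual specification; a dynamic version would additionally let the population and subject effects evolve over time through a state equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated small-sample repeated-measures data (purely illustrative):
# y[i, j] = mu + b_i + noise, for m subjects with n repeated measures each.
m, n = 8, 5
true_mu, true_tau, true_sigma = 10.0, 2.0, 1.0
b_true = rng.normal(0.0, true_tau, size=m)
y = true_mu + b_true[:, None] + rng.normal(0.0, true_sigma, size=(m, n))

# Gibbs sampler for the random-intercept model
#   y_ij | mu, b_i, sigma2 ~ N(mu + b_i, sigma2),  b_i ~ N(0, tau2),
# with a flat prior on mu and inverse-gamma(a0, b0) priors on sigma2 and tau2.
a0, b0 = 0.01, 0.01
mu, sigma2, tau2 = y.mean(), 1.0, 1.0
b = np.zeros(m)
draws = []

for it in range(4000):
    # Subject effects b_i: conjugate normal update
    prec = n / sigma2 + 1.0 / tau2
    mean = (n / sigma2) * (y.mean(axis=1) - mu) / prec
    b = rng.normal(mean, np.sqrt(1.0 / prec))

    # Population mean mu: normal update given b and sigma2 (flat prior)
    resid = y - b[:, None]
    mu = rng.normal(resid.mean(), np.sqrt(sigma2 / y.size))

    # Variances: inverse-gamma updates
    sse = np.sum((y - mu - b[:, None]) ** 2)
    sigma2 = 1.0 / rng.gamma(a0 + y.size / 2.0, 1.0 / (b0 + sse / 2.0))
    tau2 = 1.0 / rng.gamma(a0 + m / 2.0, 1.0 / (b0 + np.sum(b ** 2) / 2.0))

    if it >= 1000:  # discard burn-in
        draws.append((mu, np.sqrt(sigma2), np.sqrt(tau2)))

post = np.array(draws)
print("posterior means (mu, sigma, tau):", post.mean(axis=0).round(2))
```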
Abstract:
In 1996 and 1997, Congress ordered the Secretary of Health and Human Services to undertake negotiated rulemaking, as authorized under the Negotiated Rulemaking Act of 1990, on three separate rulemaking matters. Other federal agencies, including the Environmental Protection Agency and the Occupational Safety and Health Administration, have also made use of this procedure. As part of the program to reinvent government, President Clinton issued an executive order requiring federal agencies to engage in some negotiated rulemaking procedures. I present an analytic, interpretive and critical examination of the statutory and regulatory provisions for negotiated rulemaking as they relate to issues of democratic governance surrounding the delegation of legislative power. The paradigm of law delineated by Jürgen Habermas, which assigns law the task of achieving social or value integration as well as systems integration, provides the background theory for a critique of these processes. I address two research questions. First, why should a citizen obey a regulation that results from negotiation among directly interested parties? Second, what is the potential effect of negotiated rulemaking on other institutions of deliberative democracy? For the internal critique, I argue that the procedures for negotiated rulemaking will not produce among the participants the agreement and cooperation that the legislature intended. For the external critique, I argue that negotiated rulemaking will not result in democratically legitimated regulation. In addition, the practice of negotiated rulemaking will further weaken the functioning of the public sphere, as Habermas theorizes it, as the central institution of deliberative democracy. The primary implication is the need to counter the further development of administrative agencies into isolated, self-regulating systems loosened from the controls of democratic governance, by developing a robust public sphere in which affected persons may achieve mutual understanding.
Abstract:
Derivation of probability estimates complementary to geophysical data sets has gained special attention over recent years. Information about the confidence level of provided physical quantities is required to construct an error budget for higher-level products and to correctly interpret the final results of a particular analysis. For the generation of products based on satellite data, a common input is a cloud mask, which allows discrimination between surface and cloud signals; the surface information is further divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may affect their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited to 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: cloudy/clear-sky, cloudy/snow and clear-sky/snow. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, the PCM algorithm uses all spectral, angular and ancillary information in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test is overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. Discrimination between snow and ice clouds and detection of broken, thin clouds are enhanced by means of an invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which pose diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared with the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The analyses demonstrated the good detection skill of the PCM method, with results comparable to or better than those of the reference PPS algorithm.
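As an illustration of the single-step, LUT-based retrieval described above, the sketch below bins a pixel's feature vector into a multidimensional information space and looks up precomputed pairwise probabilities. The feature set, bin edges and LUT contents are invented placeholders; the actual PCM algorithm's channels, intervals and training procedure are not given in the abstract.

```python
import numpy as np

# Hypothetical feature space for one AVHRR pixel; the real PCM features,
# bin edges and LUT values are not specified in the abstract.
FEATURES = ["refl_0_6um", "bt_11um", "bt_diff_11_12um", "sun_zenith"]
BIN_EDGES = [
    np.linspace(0.0, 1.0, 11),      # reflectance bins
    np.linspace(200.0, 320.0, 13),  # brightness temperature bins [K]
    np.linspace(-2.0, 6.0, 9),      # split-window difference bins [K]
    np.linspace(0.0, 90.0, 10),     # sun zenith angle bins [deg]
]
shape = tuple(len(e) - 1 for e in BIN_EDGES)

# Precomputed LUTs: per bin, the pairwise probability estimates. Here filled
# with a placeholder prior of 0.5; in practice they would be derived offline
# from a labelled training data set.
lut_cloud_clear = np.full(shape, 0.5)
lut_cloud_snow = np.full(shape, 0.5)
lut_clear_snow = np.full(shape, 0.5)

def bin_index(obs):
    """Map one observation vector onto its multidimensional LUT bin."""
    idx = []
    for value, edges in zip(obs, BIN_EDGES):
        i = np.searchsorted(edges, value, side="right") - 1
        idx.append(int(np.clip(i, 0, len(edges) - 2)))  # clamp to valid bins
    return tuple(idx)

def pcm_probabilities(obs):
    """Return the three pairwise probability estimates for one pixel."""
    i = bin_index(obs)
    return {
        "cloudy_vs_clear": lut_cloud_clear[i],
        "cloudy_vs_snow": lut_cloud_snow[i],
        "clear_vs_snow": lut_clear_snow[i],
    }

print(pcm_probabilities([0.35, 265.0, 1.2, 55.0]))
```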
Abstract:
Neospora caninum is an apicomplexan parasite which has emerged as an important cause of bovine abortion worldwide. Abortion is usually triggered by reactivation of dormant bradyzoites during pregnancy and subsequent congenital infection of the foetus, in which the central nervous system appears to be the most frequently affected site. We report here on an organotypic tissue culture model for Neospora infection that can be used to study certain aspects of the cerebral phase of neosporosis within the context of a three-dimensionally organised neuronal network. Organotypic slice cultures of rat cortical tissue were infected with N. caninum tachyzoites, and the kinetics of parasite proliferation, as well as the proliferation-inhibitory effect of interferon-gamma (IFN-gamma), were monitored by immunofluorescence, transmission electron microscopy and a quantitative PCR assay on the LightCycler instrument. In addition, neuronal cytoskeletal elements, namely glial acidic protein filaments as well as actin microfilament bundles, were shown to colocalise largely with the pseudocyst periphery. This organotypic culture model for cerebral neosporosis provides a system for studying the proliferation, ultrastructural characteristics, development and interactions of N. caninum within the context of neuronal tissue that can be modulated and influenced under controlled conditions, and it will be useful in the future for gaining more information on the cerebral phase of neosporosis.
Abstract:
Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict performance requirements. Such applications often have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies in the CloudSim simulator, using data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities and then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
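To make the notion of an SLA-aware scaling policy concrete, here is a minimal sketch of a threshold-based decision rule of the kind such a simulation might compare. CloudSim itself is a Java toolkit, so this Python snippet is not its API; the thresholds, parameter names and dEIS metrics are assumptions made for illustration only.

```python
from dataclasses import dataclass

# Illustrative sketch only: captures a simple threshold-based, SLA-aware
# scaling rule, not CloudSim's API or the policies evaluated in the paper.
@dataclass
class SlaAwareScalingPolicy:
    response_time_sla_ms: float      # SLA target for dEIS request latency
    scale_in_cpu_util: float = 0.30  # scale in below this average utilisation
    min_vms: int = 1
    max_vms: int = 20

    def decide(self, current_vms, avg_cpu_util, p95_response_ms):
        """Return the VM count to use in the next monitoring interval."""
        if p95_response_ms > self.response_time_sla_ms:
            return min(current_vms + 1, self.max_vms)   # SLA at risk: scale out
        if avg_cpu_util < self.scale_in_cpu_util and current_vms > self.min_vms:
            return current_vms - 1                      # spare capacity: scale in
        return current_vms

policy = SlaAwareScalingPolicy(response_time_sla_ms=200.0)
print(policy.decide(current_vms=4, avg_cpu_util=0.85, p95_response_ms=250.0))  # -> 5
print(policy.decide(current_vms=4, avg_cpu_util=0.20, p95_response_ms=90.0))   # -> 3
```

A simulation comparing policies would sweep such thresholds and record the resulting energy consumption and SLA violations per interval.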
Abstract:
We report a trace-element and Pb lead-isotope analysis (LIA) database on the “Singen Copper”, a peculiar type of copper found in the North Alpine realm, from its type locality, the Early Bronze Age Singen cemetery (Germany). What distinguishes “Singen Copper” from other coeval copper types? (i) Is it a discrete metal lot with a uniform provenance (and if so, can its provenance be constrained)? (ii) Was it manufactured by a special, unique metallurgical process that can be discriminated from others? Trace element concentrations can give clues on the ore types that were mined, but they can be modified (more or less intentionally) by metallurgical operations. More robust indicators are the ratios of chemically similar elements (e.g. Co/Ni, Bi/Sb), since they should remain nearly constant during metallurgical operations and are expected to behave homogeneously in each mineral of a given mining area; however, their partition amongst the different mineral species is known to cause strong inter-element fractionations. We tested the trace element ratio pattern predicted by geochemical arguments on the Brixlegg mining area. Brixlegg itself is not compatible with the Singen Copper objects, and we report it only because it is a rare instance of a mining area for which sufficient trace element analyses are available in the literature. We observe that As/Sb in fahlerz varies by a factor of 1.8 above/below the median, while As/Sb in enargite varies by a factor of 2.5 around a median ten times higher. Most of the 102 analyzed metal objects from Singen are Sb-Ni-rich, corresponding to the “antimony-nickel copper” of the literature. Other trace element concentrations vary by factors > 100, and ratios by factors > 50. The Pb isotopic compositions are all significantly different from each other; they do not form a single linear array and require > 3 ore batches that certainly do not derive from a single mining area. Our data suggest a heterogeneous provenance of the “Singen Copper”. Archaeological information limits the scope to Central European sources. LIA requires a diverse supply network drawing on many mining localities, possibly including Brittany. Trace element ratios show more heterogeneity than LIA; this can be explained either by deliberate selection of one particular ore mineral (from very many sources) or by processing of assorted ore minerals from a smaller number of sources, with the unintentional effect that the quality of the copper would not be constant, as the metallurgical properties of the alloys would vary with trace element concentrations.
Abstract:
White matter connects different brain areas and provides electrical insulation to neurons’ axons via myelin sheaths, enabling rapid signal transmission. Owing to its modulatory role in signal conduction, white matter is essential to learning, cognition and psychiatric disorders (Fields, 2008a). The non-invasive investigation of white matter anatomy and function in vivo therefore provides a unique opportunity to explore the most complex organ of our body. The present thesis applied a multimodal neuroimaging approach to investigate different white matter properties in psychiatric and healthy populations: on the one hand, white matter microstructural properties were investigated in a psychiatric population; on the other hand, white matter metabolic properties were assessed in healthy adults, providing basic information about the brain’s wiring. Three research papers are presented. The first paper assessed the microstructural properties of white matter in relation to a frequent epidemiologic finding in schizophrenia: reduced white matter integrity was observed in patients born in summer and autumn compared with patients born in winter and spring. Despite the large genetic basis of schizophrenia, accumulating evidence indicates that environmental exposures may be implicated in its development (A. S. Brown, 2011). Notably, epidemiologic studies have shown a 5–8% excess of births during winter and spring for patients with schizophrenia in the Northern Hemisphere at higher latitudes (Torrey, Miller, Rawlings, & Yolken, 1997). Although the underlying mechanisms are unclear, this seasonal birth effect may indicate fluctuating environmental risk factors for schizophrenia. Exposure to harmful factors during foetal development may result in the activation of pathologic neural circuits during adolescence or young adulthood, increasing the risk of schizophrenia (Fatemi & Folsom, 2009). White matter development starts during the foetal period and continues until adulthood, but its major development is accomplished by the age of two years (Brody, Kinney, Kloman, & Gilles, 1987; Huang et al., 2009). This indicates a vulnerability period of white matter that may coincide with the fluctuating environmental risk factors for schizophrenia. Since microstructural alterations of white matter are frequently observed in schizophrenia, the study provides evidence for the neurodevelopmental hypothesis of schizophrenia. The second research paper showed a positive correlation between white matter microstructure and its perfusion with blood across healthy adults. This finding is in line with clinical studies indicating a tight coupling between cerebral perfusion and white matter health across subjects (Amann et al., 2012; Chen, Rosas, & Salat, 2013; Kitagawa et al., 2009). Although relatively little is known about the metabolic properties of white matter, microstructural properties such as axon diameter and myelination might be coupled with its metabolic demand. Furthermore, the ability to detect a perfusion signal in white matter accords with a recent study showing that technical improvements, such as pseudo-continuous arterial spin labeling, enable the reliable detection of the white matter perfusion signal (van Osch et al., 2009).
The third paper involved a collaboration within the same department to assess the interrelation between functional connectivity networks and their underlying structural connectivity.
Abstract:
Given a short-arc optical observation with estimated angle-rates, the admissible region is a compact region in the range / range-rate space defined such that all likely and relevant orbits are contained within it. An alternative boundary value problem formulation has recently been proposed where range / range hypotheses are generated with two angle measurements from two tracks as input. In this paper, angle-rate information is reintroduced as a means to eliminate hypotheses by bounding their constants of motion before a more computationally costly Lambert solver or differential correction algorithm is run.
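The following sketch illustrates the general idea of screening range/range-rate hypotheses by bounding their constants of motion before invoking a costlier solver: it reconstructs the object's position and velocity from the observer state and line-of-sight geometry, computes the specific orbital energy, and keeps only bound orbits whose semi-major axis falls in an assumed band. The bounds, frame setup and numbers are illustrative assumptions, not the paper's actual criteria.

```python
import numpy as np

MU_EARTH = 398600.4418  # km^3/s^2

def energy_and_sma(r_obs, v_obs, u_hat, u_dot, rho, rho_dot):
    """Specific orbital energy and semi-major axis for one range/range-rate
    hypothesis, given the observer state and the line-of-sight geometry."""
    r = r_obs + rho * u_hat                      # object position [km]
    v = v_obs + rho_dot * u_hat + rho * u_dot    # object velocity [km/s]
    energy = 0.5 * np.dot(v, v) - MU_EARTH / np.linalg.norm(r)
    sma = -MU_EARTH / (2.0 * energy) if energy < 0 else np.inf
    return energy, sma

def keep_hypothesis(r_obs, v_obs, u_hat, u_dot, rho, rho_dot,
                    sma_min=6600.0, sma_max=50000.0):
    """Cheap admissibility screen: keep only bound orbits whose semi-major
    axis falls inside an assumed band of interest (values are placeholders)."""
    energy, sma = energy_and_sma(r_obs, v_obs, u_hat, u_dot, rho, rho_dot)
    return energy < 0.0 and sma_min <= sma <= sma_max

# Example: a geostationary-like hypothesis seen from a station on the equator.
r_obs = np.array([6378.0, 0.0, 0.0])        # station position [km]
v_obs = np.array([0.0, 0.465, 0.0])         # station velocity [km/s]
u_hat = np.array([1.0, 0.0, 0.0])           # line of sight (unit vector)
u_dot = np.array([0.0, 7.2921e-5, 0.0])     # LOS rate from angle-rates [rad/s]
print(keep_hypothesis(r_obs, v_obs, u_hat, u_dot, rho=35786.0, rho_dot=0.0))
```

Hypotheses that survive such bounds would then be passed to the Lambert solver or differential correction step mentioned above.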
Abstract:
Information systems (IS) outsourcing projects often fail to achieve their initial goals. To avoid project failure, managers need to design formal controls that meet the specific contextual demands of the project. However, the dynamic and uncertain nature of IS outsourcing projects makes it difficult to design such specific formal controls at the outset of a project, so it is crucial to translate high-level project goals into specific formal controls during the course of a project. This study seeks to understand the underlying patterns of such translation processes. Based on a comparative case study of four outsourced software development projects, we inductively develop a process model that consists of three distinct patterns. The process model shows that the performance implications of emergent controls with higher specificity depend on differences in the translation process: specific formal controls have positive implications for goal achievement if only the stakeholder context is adapted, but negative implications if tasks are unintentionally adapted in the translation process. In the latter case, projects incrementally drift away from their initial direction. Our findings help to better understand control dynamics in IS outsourcing projects. We contribute to a process-theoretic understanding of IS outsourcing governance and derive implications for control theory and the IS project escalation literature.
Abstract:
This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.
Abstract:
Most commercial project management software packages include planning methods to devise schedules for resource-constrained projects. Since the planning methods that are implemented are proprietary information of the software vendors, the question arises of how the packages differ in quality with respect to their resource-allocation capabilities. We experimentally evaluate the resource-allocation capabilities of eight recent software packages using 1,560 instances with 30, 60, and 120 activities from the well-known PSPLIB library. In some of the analyzed packages, the user may influence the resource allocation by means of multi-level priority rules, whereas in other packages only a few options can be chosen. We study the impact of various complexity parameters and priority rules on the project duration obtained by the software packages. The results indicate that the resource-allocation capabilities of these packages differ significantly. In general, the relative gap between the packages grows with increasing resource scarcity and with an increasing number of activities. Moreover, the selection of the priority rule has a considerable impact on the project duration. Surprisingly, when a priority rule is selected in the packages that allow it, both the mean and the variance of the project duration are generally worse than for the packages that do not offer priority-rule selection.
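For readers unfamiliar with priority-rule-based resource allocation, the sketch below shows a generic serial schedule-generation scheme with a single priority rule on a toy project. It is a textbook-style illustration under assumed data; the planning methods inside the evaluated commercial packages are proprietary and are not reproduced here.

```python
# Minimal serial schedule-generation scheme with one priority rule
# (shortest processing time) and a single renewable resource.
CAPACITY = 4  # units of the renewable resource

# activity: (duration, resource demand, set of predecessors)
ACTIVITIES = {
    "A": (3, 2, set()),
    "B": (2, 3, set()),
    "C": (4, 2, {"A"}),
    "D": (3, 1, {"A", "B"}),
    "E": (2, 3, {"C", "D"}),
}

def serial_sgs(activities, capacity, priority=lambda a: ACTIVITIES[a][0]):
    horizon = sum(d for d, _, _ in activities.values())
    usage = [0] * (horizon + 1)          # resource usage per time period
    start, finish = {}, {}
    while len(finish) < len(activities):
        # Eligible: unscheduled activities whose predecessors are all scheduled.
        eligible = [a for a, (_, _, preds) in activities.items()
                    if a not in finish and preds <= finish.keys()]
        a = min(eligible, key=priority)  # apply the priority rule
        dur, demand, preds = activities[a]
        t = max((finish[p] for p in preds), default=0)
        # Shift right until the resource profile admits the activity.
        while any(usage[t + k] + demand > capacity for k in range(dur)):
            t += 1
        for k in range(dur):
            usage[t + k] += demand
        start[a], finish[a] = t, t + dur
    return start, max(finish.values())

schedule, makespan = serial_sgs(ACTIVITIES, CAPACITY)
print("start times:", schedule, "makespan:", makespan)
```

Swapping the priority function (e.g. latest finish time instead of shortest processing time) changes the resulting makespan, which is exactly the kind of sensitivity the evaluation above measures.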
Abstract:
Soils provide us with over 90% of all human food, livestock feed, fibre and fuel on Earth. Soils, however, have more than just productive functions. The key challenge in coming years will be to address the diverse and potentially conflicting demands now being made by human societies and other forms of life, while ensuring that future generations have the same potential to use soils and land of comparable quality. In a multi-level stakeholder approach, down-to-earth action will have to be supplemented with measures at various levels, from households to communities, and from national policies to international conventions. Knowledge systems, both indigenous and scientific, and related research and learning processes must play a central role. Ongoing action can be enhanced through a critical assessment of the impact of past achievements, and through better cooperation between people and institutions.
Abstract:
The attentional blink (AB) is a fundamental limitation of the ability to select relevant information from irrelevant information. It can be observed with the detection rate in an AB task as well as with the corresponding P300 amplitude of the event-related potential. In previous research, however, correlations between these two levels of observation were weak and rather inconsistent. A possible explanation of this finding might be that multiple processes underlie the AB and, thus, obscure a possible relationship between AB-related detection rate and the corresponding P300 amplitude. The present study investigated this assumption by applying a fixed-links modeling approach to represent behavioral individual differences in the AB as a latent variable. Concurrently, this approach enabled us to control for additional sources of variance in AB performance by deriving two additional latent variables. The correlation between the latent variable reflecting behavioral individual differences in AB magnitude and a corresponding latent variable derived from the P300 amplitude was high (r=.70). Furthermore, this correlation was considerably stronger than the correlations of other behavioral measures of the AB magnitude with their psychophysiological counterparts (all rs<.40). Our findings clearly indicate that the systematic disentangling of various sources of variance by utilizing the fixed-links modeling approach is a promising tool to investigate behavioral individual differences in the AB and possible psychophysiological correlates of these individual differences.
Abstract:
Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem, we analyze the current state of practice and propose an approach based on an extensible, declarative and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from the technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects the optimal repairing solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
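To illustrate what a declarative, tool-agnostic conformance rule might look like, the sketch below encodes layer-dependency rules as data and checks them against dependencies extracted by an arbitrary analysis back end. The rule format, module names and checker are hypothetical and are not the specification language proposed in the work.

```python
# Illustrative sketch of declarative, tool-agnostic conformance rules;
# everything below is invented for illustration only.

# Declarative rules: which source-code layers may depend on which.
RULES = [
    {"from": "ui", "may_depend_on": {"application"}},
    {"from": "application", "may_depend_on": {"domain"}},
    {"from": "domain", "may_depend_on": set()},  # the core depends on nothing
]

# Dependencies as they might be extracted from the implementation by any
# static-analysis back end; the checker is agnostic to which tool produced them.
DEPENDENCIES = [
    ("ui.OrderView", "application.OrderService"),
    ("application.OrderService", "domain.Order"),
    ("domain.Order", "ui.OrderView"),            # violation: core -> UI
]

def layer(qualified_name):
    return qualified_name.split(".", 1)[0]

def check(rules, dependencies):
    allowed = {r["from"]: r["may_depend_on"] for r in rules}
    violations = []
    for source, target in dependencies:
        src, tgt = layer(source), layer(target)
        if src != tgt and tgt not in allowed.get(src, set()):
            violations.append((source, target))
    return violations

for source, target in check(RULES, DEPENDENCIES):
    print(f"violation: {source} must not depend on {target}")
```

Because the rules are plain data rather than tool-specific scripts, the same specification could be fed to different validators, which is the decoupling the approach above argues for.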
Abstract:
Can the early identification of the staphylococcal species responsible for an infection by means of real-time PCR technology influence the approach to treatment of these infections? This study was a retrospective cohort study in which two groups of patients were compared. The first group, ‘Physician Aware’, consisted of patients whose physicians were informed of the specific staphylococcal species and antibiotic sensitivity (using RT-PCR) at the time of notification of the Gram stain. The second group, ‘Physician Unaware’, consisted of patients whose treating physicians received the same information 24–72 hours later, as a result of blood culture and antibiotic sensitivity determination. The approach to treatment was compared between the ‘Physician Aware’ and ‘Physician Unaware’ groups for three different microbiological diagnoses, namely MRSA, MSSA and no-SA (coagulase-negative Staphylococcus). For a diagnosis of MRSA, the mean time to initiation of vancomycin therapy was 1.08 hours in the ‘Physician Aware’ group compared with 5.84 hours in the ‘Physician Unaware’ group (p=0.34). For a diagnosis of MSSA, the mean time to initiation of specific anti-MSSA therapy with nafcillin was 5.18 hours in the ‘Physician Aware’ group compared with 49.8 hours in the ‘Physician Unaware’ group (p=0.007). For the same diagnosis, the mean duration of empiric therapy was 19.68 hours in the ‘Physician Aware’ group compared with 80.75 hours in the ‘Physician Unaware’ group (p=0.003). For a diagnosis of no-SA (coagulase-negative Staphylococcus), the mean duration of empiric therapy was 35.65 hours in the ‘Physician Aware’ group compared with 44.38 hours in the ‘Physician Unaware’ group (p=0.07). However, when treatment was considered as a categorical variable and after exclusion of all cases where anti-MRS therapy was used for unrelated conditions, only 20 of 72 cases in the ‘Physician Aware’ group received treatment, compared with 48 of 106 cases in the ‘Physician Unaware’ group. Conclusions: Earlier diagnosis of MRSA may not alter final treatment outcomes, but earlier identification may lead to the earlier institution of measures to limit the spread of infection. Early diagnosis of MSSA infection does lead to specific antibiotic therapy at an earlier stage of treatment, and the duration of empiric therapy is greatly reduced by early diagnosis. Early diagnosis of coagulase-negative staphylococcal infection leads to a lower rate of unnecessary treatment, as these organisms are commonly considered contaminants.