896 results for Simulation Systems Analysis


Relevance: 40.00%

Abstract:

Software for use with patient records is challenging to design and difficult to evaluate because of the tremendous variability of patient circumstances. The authors devised a method to overcome a number of these difficulties. The method objectively evaluates and compares software products for use in emergency departments, and compares them to conventional methods such as dictation and templated chart forms. The technique uses oral case simulation and video recording for analysis. This presentation discusses the methodology and the experience of executing a study using case simulation.

Relevance: 40.00%

Abstract:

The behavior of sample components whose pI values are outside the pH gradient established by 101 hypothetical biprotic carrier ampholytes covering a pH 6-8 range was investigated by computer simulation under constant-current conditions with concomitant constant electroosmosis toward the cathode. Data obtained with the sample applied between zones of carrier ampholytes and on the anodic side of the carrier ampholytes were studied and found to evolve into zone structures comprising three regions between anolyte and catholyte. The focusing region with the pH gradient is bracketed by two isotachophoretic zone structures comprising selected sample and carrier components as isotachophoretic zones. The isotachophoretic structures electrophoretically migrate in opposite directions, and their lengths increase with time due to the gradual isotachophoretic decay at the pH gradient edges. Due to electroosmosis, however, the overall pattern is transported toward the cathode. Sample components whose pI values are outside the established pH gradient are demonstrated to form isotachophoretic zones behind the leading cation of the catholyte (components with pI values larger than 8) and the leading anion of the anolyte (components with pI values smaller than 6). Amphoteric compounds with appropriate pI values or nonamphoteric components can act as isotachophoretic spacers between sample compounds or between the leader and the sample component with the highest mobility. The simulation data provide the first insight into the dynamics of amphoteric sample components that do not focus within the established pH gradient.
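The focusing behaviour above hinges on each ampholyte's isoelectric point. As a minimal illustration (not the authors' simulation code, which models 101 biprotic carrier ampholytes under constant current), the sketch below computes the net charge of a hypothetical biprotic ampholyte from assumed pKa values; the charge vanishes at the pI, which is what anchors a component inside the pH gradient:

```python
def net_charge(pH, pKa1, pKa2):
    """Mean net charge of a biprotic ampholyte (+1/0/-1 protonation
    states) from the Henderson-Hasselbalch equilibria."""
    h = 10.0 ** (-pH)
    k1 = 10.0 ** (-pKa1)
    k2 = 10.0 ** (-pKa2)
    denom = h * h + h * k1 + k1 * k2
    positive = h * h / denom     # fully protonated (+1) fraction
    negative = k1 * k2 / denom   # fully deprotonated (-1) fraction
    return positive - negative

def isoelectric_point(pKa1, pKa2):
    """For a biprotic ampholyte, the pI is the average of the two pKa's."""
    return 0.5 * (pKa1 + pKa2)

# A hypothetical carrier ampholyte inside the pH 6-8 gradient:
pI = isoelectric_point(6.8, 7.2)  # = 7.0
# The net charge is positive below the pI, zero at it, and negative
# above it, which is what holds a component at its focusing position.
```

Components with pI above 8 or below 6 never reach zero net charge inside the gradient, which is why they end up in the isotachophoretic edge structures rather than focusing.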

Relevance: 40.00%

Abstract:

A reliable and robust routing service for Flying Ad-Hoc Networks (FANETs) must be able to adapt to topology changes and to recover the quality level of multiple delivered video flows under dynamic network topologies. The user experience of watching live video must remain satisfactory even under network congestion, buffer overflow, and packet loss, as experienced in many FANET multimedia applications. In this paper, we perform a comparative simulation study to assess the robustness, reliability, and quality level of videos transmitted via well-known beaconless opportunistic routing protocols. Simulation results show that our protocol, XLinGO, achieves multimedia dissemination with Quality of Experience (QoE) support and robustness in multi-hop, multi-flow, and mobile networks, as required in many multimedia FANET scenarios.

Relevance: 40.00%

Abstract:

Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict performance requirements. Such applications often have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies using the CloudSim simulator and data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities. We then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
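As an illustration of the kind of policy being compared (a hedged sketch, not the paper's CloudSim/Java implementation; thresholds, VM limits, per-VM capacity, and the load trace are invented for the example), a threshold-based VM scaling rule can be simulated as:

```python
# Minimal sketch (not CloudSim, which is Java) of a threshold-based,
# SLA-aware VM scaling policy. All numeric parameters are assumptions.

def scaling_decision(avg_utilization, n_vms,
                     scale_out_at=0.80, scale_in_at=0.30,
                     min_vms=1, max_vms=10):
    """Return the VM count for the next monitoring interval."""
    if avg_utilization > scale_out_at and n_vms < max_vms:
        return n_vms + 1  # scale out before the SLA is violated
    if avg_utilization < scale_in_at and n_vms > min_vms:
        return n_vms - 1  # scale in to save datacenter energy
    return n_vms

def simulate(load_trace, n_vms=1, capacity_per_vm=100.0):
    """Replay a load trace and record the VM count per interval."""
    history = []
    for load in load_trace:
        utilization = min(load / (n_vms * capacity_per_vm), 1.0)
        n_vms = scaling_decision(utilization, n_vms)
        history.append(n_vms)
    return history

trace = [50, 90, 170, 260, 260, 120, 40, 20]
print(simulate(trace))  # -> [1, 2, 3, 4, 4, 4, 3, 2]
```

The trade-off the paper measures is visible even here: earlier scale-out protects the SLA at the cost of running more VMs (and hence more datacenter energy) during the load peak.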

Relevance: 40.00%

Abstract:

We consider a large quantum system with spins 1/2 whose dynamics is driven entirely by measurements of the total spin of spin pairs. This gives rise to a dissipative coupling to the environment. When one averages over the measurement results, the corresponding real-time path integral does not suffer from a sign problem. Using an efficient cluster algorithm, we study the real-time evolution from an initial antiferromagnetic state of the two-dimensional Heisenberg model, which is driven to a disordered phase not by a Hamiltonian, but by sporadic measurements or by continuous Lindblad evolution.

Relevance: 40.00%

Abstract:

Using quantum Monte Carlo, we study the nonequilibrium transport of magnetization in large open strongly correlated quantum spin-1/2 systems driven by purely dissipative processes that conserve the uniform or staggered magnetization, disregarding unitary Hamiltonian dynamics. We prepare both a low-temperature Heisenberg ferromagnet and an antiferromagnet in two parts of the system that are initially isolated from each other. We then bring the two subsystems into contact and study their real-time dissipative dynamics for different geometries. The flow of the uniform or staggered magnetization from one part of the system to the other is described by a diffusion equation that can be derived analytically.
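The diffusive transport described in the last sentence can be illustrated with a minimal classical sketch (not the quantum Monte Carlo method itself; the grid size, diffusion constant, and time step are illustrative assumptions):

```python
# Classical 1D illustration: two initially isolated halves with opposite
# magnetization are joined at t = 0 and relax via dm/dt = D d2m/dx2,
# integrated with an explicit finite-difference scheme and flux-free
# (reflecting) ends.

def diffuse(m, D=0.2, dt=1.0, steps=200):
    """Explicit finite-difference integration (dx = 1, stable for D*dt <= 0.5)."""
    for _ in range(steps):
        m = [m[i] + D * dt * ((m[i - 1] if i > 0 else m[0])
                              - 2 * m[i]
                              + (m[i + 1] if i < len(m) - 1 else m[-1]))
             for i in range(len(m))]
    return m

# Left half 'ferromagnet' (m = +1) against right half (m = -1):
profile = diffuse([1.0] * 10 + [-1.0] * 10)
# The reflecting ends conserve the total magnetization while the
# sharp step relaxes toward a uniform profile.
```

The conserved quantity here plays the role of the uniform or staggered magnetization in the abstract: the dissipative contact only redistributes it between the two subsystems.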

Relevance: 40.00%

Abstract:

Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to accurately estimating the orientation and magnitude of motion vectors. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative, and for contingency scenarios for ESA spacecraft like ENVISAT. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the relevant internal and external effects that occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software, currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months), and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit.
The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun, and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity, and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
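As a rough illustration of the attitude-propagation core (a sketch under assumed inertia and damping values, not ιOTA itself), torque-free Euler rotational dynamics with a simple eddy-current-style damping torque can be integrated as:

```python
# Illustrative sketch only: inertia values [kg m^2], damping constant
# and step size are assumptions, not values from the iOTA project.

def propagate(omega, inertia=(120.0, 100.0, 80.0), k=0.05,
              dt=0.1, steps=1000):
    """Forward-Euler integration of Euler's rotational equations for a
    rigid body with an eddy-current-style damping torque T = -k*omega."""
    i1, i2, i3 = inertia
    w1, w2, w3 = omega
    for _ in range(steps):
        dw1 = ((i2 - i3) * w2 * w3 - k * w1) / i1
        dw2 = ((i3 - i1) * w3 * w1 - k * w2) / i2
        dw3 = ((i1 - i2) * w1 * w2 - k * w3) / i3
        w1, w2, w3 = w1 + dt * dw1, w2 + dt * dw2, w3 + dt * dw3
    return (w1, w2, w3)

# A slow tumble, mainly about the major axis, is gradually de-spun
# by the damping torque:
spin = propagate((0.10, 0.02, 0.05))
```

In the real tool this core would be wrapped with the environmental torque models the abstract lists (drag, radiation pressure, gravity gradient, impacts), each of which simply adds terms to the right-hand sides above.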

Relevance: 40.00%

Abstract:

Amplification of human chromosome 20q DNA is the most frequently occurring chromosomal abnormality detected in sporadic colorectal carcinomas and shows significant correlation with liver metastases. Through comprehensive high-resolution microarray comparative genomic hybridization and microarray gene expression profiling, we have characterized chromosome 20q amplicon genes associated with human colorectal cancer metastasis in two in vitro metastasis model systems. The results revealed increasing complexity of the 20q genomic profile from the primary tumor-derived cell lines to the lymph node and liver metastasis-derived cell lines. Expression analysis of chromosome 20q revealed a subset of overexpressed genes residing within the regions of genomic copy number gain in all the tumor cell lines, suggesting these are chromosome 20q copy-number-responsive genes. Based on their preferential expression levels in the model system cell lines and known biological function, four of the overexpressed genes mapping to the common intervals of genomic copy gain were considered the most promising candidate colorectal metastasis-associated genes. Validation of genomic copy number and expression array data was carried out on these genes, with one gene, DNMT3B, standing out as expressed at relatively higher levels in the metastasis-derived cell lines compared with their primary-derived counterparts in both model systems analyzed. The data provide evidence for the role of chromosome 20q genes with low copy gain and elevated expression in the clonal evolution of metastatic cells and suggest that such genes may serve as early biomarkers of metastatic potential. The data also support the utility of combined microarray comparative genomic hybridization and expression array analysis for identifying copy-number-responsive genes in areas of low DNA copy gain in cancer cells.

Relevance: 40.00%

Abstract:

Interim clinical trial monitoring procedures are motivated by ethical and economic considerations. Classical Brownian motion (Bm) techniques for the statistical monitoring of clinical trials are widely used. Conditional power arguments and α-spending-function-based boundary-crossing probabilities are popular statistical hypothesis testing procedures under the assumption of Brownian motion. However, it is not rare that the assumptions of Brownian motion are only partially met by trial data. Therefore, we used a more general stochastic process, fractional Brownian motion (fBm), to model the test statistics. Fractional Brownian motion does not have the Markov property: future observations depend not only on the present observations but also on past ones. In this dissertation, we simulated a wide range of fBm data, e.g., H = 0.5 (that is, classical Bm) vs. 0.5 < H < 1, with and without treatment effects. The performance of conditional power and boundary-crossing based interim analyses was then compared under the assumption that the data follow Bm or fBm. Our simulation study suggests that the conditional power and boundaries under fBm assumptions are generally higher than those under Bm assumptions when H > 0.5, and also match the empirical results better.
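A hedged sketch of how such fBm data can be generated (via the exact autocovariance of fractional Gaussian noise plus a Cholesky factorization; the path length, H value, and seed are illustrative, and this is not the dissertation's code):

```python
import random

def fgn_cov(k, hurst):
    """Autocovariance of fractional Gaussian noise at integer lag k."""
    return 0.5 * (abs(k + 1) ** (2 * hurst) - 2 * abs(k) ** (2 * hurst)
                  + abs(k - 1) ** (2 * hurst))

def cholesky(a):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(a)
    low = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(low[i][k] * low[j][k] for k in range(j))
            if i == j:
                low[i][j] = (a[i][i] - s) ** 0.5
            else:
                low[i][j] = (a[i][j] - s) / low[j][j]
    return low

def simulate_fbm(n, hurst, seed=0):
    """Sample one fBm path of length n by the Cholesky method:
    correlate i.i.d. Gaussians, then cumulate the increments."""
    rng = random.Random(seed)
    cov = [[fgn_cov(i - j, hurst) for j in range(n)] for i in range(n)]
    low = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    increments = [sum(low[i][k] * z[k] for k in range(i + 1))
                  for i in range(n)]
    path, total = [], 0.0
    for x in increments:
        total += x
        path.append(total)
    return path

# H = 0.5 reduces to classical Bm (uncorrelated increments);
# H > 0.5 gives the persistent increments the dissertation studies.
path = simulate_fbm(100, 0.7)
```

The lag-1 autocovariance vanishing exactly at H = 0.5 and turning positive for H > 0.5 is precisely the departure from the Markov property that motivates the fBm model.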

Relevance: 40.00%

Abstract:

An interim analysis is usually applied in later phase II or phase III trials to find convincing evidence of a significant treatment difference, which may lead to trial termination earlier than planned. This can save patient resources and shorten drug development and approval time. Ethical and economic considerations are further reasons to stop a trial early. In clinical trials of eyes, ears, knees, arms, kidneys, lungs, and other clustered treatments, data may include distribution-free random variables with matched and unmatched subjects in one study. It is important to properly include both kinds of subjects in the interim and final analyses so that maximum efficiency of statistical and clinical inference can be obtained at different stages of the trial. So far, no publication has applied a statistical method for distribution-free data with matched and unmatched subjects to the interim analysis of clinical trials. In this simulation study, the hybrid statistic was used to estimate the empirical powers and empirical type I errors among simulated datasets with different sample sizes, effect sizes, correlation coefficients for matched pairs, and data distributions, in interim and final analyses with 4 different group sequential methods. Empirical powers and empirical type I errors were also compared to those estimated using the meta-analysis t-test on the same simulated datasets. Results from this simulation study show that, compared to the meta-analysis t-test commonly used for normally distributed observations, the hybrid statistic has greater power for data from normally, log-normally, and multinomially distributed random variables with matched and unmatched subjects and with outliers. Powers rose with increases in sample size, effect size, and the correlation coefficient for matched pairs.
In addition, lower type I errors were observed when estimated using the hybrid statistic, which indicates that this test is also conservative for data with outliers in the interim analysis of clinical trials.
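The general Monte Carlo recipe behind such empirical power and type I error estimates can be sketched as follows (using a plain two-sample z-test as a stand-in for the hybrid statistic; the sample size, effect size, replication count, and seed are all illustrative):

```python
import math
import random

def z_test_p(x, y):
    """Two-sample z-test p-value for unit-variance data
    (a simple stand-in for the dissertation's hybrid statistic)."""
    nx, ny = len(x), len(y)
    z = (sum(x) / nx - sum(y) / ny) / math.sqrt(1.0 / nx + 1.0 / ny)
    # Two-sided p-value from the standard normal CDF.
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def empirical_rate(effect, n=50, reps=2000, alpha=0.05, seed=1):
    """Fraction of simulated two-arm trials in which the test rejects H0.
    effect = 0 estimates the empirical type I error; effect > 0, the power."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        x = [rng.gauss(effect, 1.0) for _ in range(n)]
        y = [rng.gauss(0.0, 1.0) for _ in range(n)]
        if z_test_p(x, y) < alpha:
            rejections += 1
    return rejections / reps

type1 = empirical_rate(0.0)  # should sit near the nominal alpha = 0.05
power = empirical_rate(0.6)  # power rises with effect size and n
```

The study described above repeats this loop per combination of sample size, effect size, correlation, distribution, and group sequential method, replacing the simple test with the hybrid statistic and the meta-analysis t-test.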

Relevance: 40.00%

Abstract:

The purpose of this multiple case study was to determine how hospital subsystems (such as physician monitoring and credentialing, quality assurance, risk management, and peer review) were supporting the monitoring of physicians. Three large metropolitan hospitals in Texas were studied and designated hospitals #1, #2, and #3. Recognizing that hospital subsystems are unique entities within a larger system, conclusions were drawn on the premises of a quality control system, in relation to the tools of government (particularly the Health Care Quality Improvement Act (HCQIA)), and in relation to the subsystem itself as a tool of a hospital. Three major analytical assessments were performed: first, the subsystems were analyzed for "completeness"; second, for "performance"; and third, with reference to the interaction of completeness and performance. The physician credentialing and monitoring and peer review subsystems as quality control systems were most complete, efficient, and effective in hospitals #1 and #3. The HCQIA did not seem to be an influencing factor in the completeness of the subsystem in hospital #1. The quality assurance and risk management subsystems in hospital #2 were not representative of completeness and performance, and the HCQIA was not an influencing factor in the completeness of the Q.A. or R.M. systems in any hospital. The efficiency (computerization) of the physician credentialing, quality assurance, and peer review subsystems in hospitals #1 and #3 seemed to contribute to their effectiveness (system-wide effect). The results indicated that the more complete, effective, and efficient subsystems were characterized by (1) all defined activities being met, (2) the HCQIA being an influencing factor, (3) a decentralized administrative structure, (4) computerization as an important element, and (5) staff sophistication in subsystem operations.
However, other variables were identified that deserve further research as to their effect on the completeness and performance of subsystems. They include (1) medical staff affiliations, (2) system funding levels, (3) the system's administrative structure, and (4) the "cultural" characteristics of the physician staff. By understanding other influencing factors, health care administrators may plan subsystems that are compatible with legislative requirements and administrative objectives.

Relevance: 40.00%

Abstract:

Three sediment cores from the Bragança Peninsula, located in the coastal region of north-eastern Pará State, have been studied by pollen analysis to reconstruct Holocene environmental changes and the dynamics of the mangrove ecosystem. The cores were taken from an Avicennia forest (Bosque de Avicennia (BDA)), a salt marsh area (Campo Salgado (CS)) and a Rhizophora-dominated area (Furo do Chato (FDC)). Pollen traps were installed in five different areas of the peninsula to study modern pollen deposition. Nine accelerator mass spectrometry radiocarbon dates provide time control and show that the sediment deposits accumulated relatively undisturbed. Mangrove vegetation started to develop at different times at the three sites: at 5120 14C yr BP at the CS site, at 2170 14C yr BP at the BDA site and at 1440 14C yr BP at the FDC site. Since mid-Holocene times, the mangroves covered even the most elevated area on the peninsula, which is today a salt marsh, suggesting somewhat higher relative sea levels. The pollen concentration in relatively undisturbed deposits seems to be an indicator of the frequency of inundation. The tidal inundation frequency decreased, probably related to lower sea levels, during the late Holocene: around 1770 14C yr BP at BDA, around 910 14C yr BP at FDC and around 750 14C yr BP at CS. The change from a mangrove ecosystem to a salt marsh at the higher elevation, around 420 14C yr BP, is probably natural and not due to anthropogenic impact. Modern pollen rain from different mangrove types shows different ratios between Rhizophora and Avicennia pollen, which can be used to reconstruct the past composition of the mangrove. In spite of bioturbation and especially tidal inundation, which change the local pollen deposition within the mangrove zone, past mangrove dynamics can be reconstructed. The pollen record for BDA indicates a mixed Rhizophora/Avicennia mangrove vegetation between 2170 and 1770 14C yr BP.
Later, Rhizophora trees became more frequent, and since ca. 200 14C yr BP Avicennia has dominated the forest.