943 results for software analysis


Relevance:

30.00%

Publisher:

Abstract:

Costs and environmental impacts are key elements in forest logistics and must be integrated into forest decision-making. The evaluation of transportation fuel costs and carbon emissions depends on spatial and non-spatial data, but in many cases the former are difficult to obtain. Moreover, the availability of software tools to evaluate transportation fuel consumption as well as costs and emissions of carbon dioxide is limited. We developed a software tool that combines two empirically validated models of truck transportation using Digital Elevation Model (DEM) data and an open spatial data source, specifically OpenStreetMap©. The tool generates tabular data and spatial outputs (maps) with information on fuel consumption, cost and CO2 emissions for four types of trucks. It also generates maps of the distribution of transport performance indicators (the relation between beeline and real road distances). These outputs can be easily included in forest decision-making support systems. Finally, we applied the tool to a particular case of forest logistics in north-eastern Portugal.
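
A minimal sketch (not the authors' tool) of how per-segment fuel use, cost and CO2 might be aggregated along a road network route and combined with the beeline/road-distance performance indicator. The consumption rate, slope penalty, fuel price and emission factor below are illustrative assumptions, not values from the paper.

```python
# Illustrative assumptions (NOT values from the paper):
FUEL_L_PER_KM = 0.45        # assumed average truck consumption, litres/km
SLOPE_PENALTY = 4.0         # assumed extra litres/km per unit uphill grade
FUEL_PRICE_EUR_PER_L = 1.5  # assumed diesel price, EUR/litre
CO2_KG_PER_L = 2.68         # commonly cited diesel emission factor (approximate)

def segment_metrics(length_km, elev_gain_m):
    """Fuel, cost and CO2 for one road segment, using a slope-adjusted rate."""
    grade = max(elev_gain_m / (length_km * 1000.0), 0.0)  # uphill grade from DEM data
    litres = length_km * (FUEL_L_PER_KM + SLOPE_PENALTY * grade)
    return litres, litres * FUEL_PRICE_EUR_PER_L, litres * CO2_KG_PER_L

def route_summary(segments, beeline_km):
    """Aggregate a route (list of (length_km, elev_gain_m) segments) and compute
    the transport performance indicator (beeline / real road distance)."""
    road_km = sum(s[0] for s in segments)
    fuel = cost = co2 = 0.0
    for length_km, gain_m in segments:
        f, c, e = segment_metrics(length_km, gain_m)
        fuel += f; cost += c; co2 += e
    return {"road_km": road_km, "fuel_l": fuel, "cost_eur": cost,
            "co2_kg": co2, "performance": beeline_km / road_km}

# Example: a three-segment route with a 12 km beeline distance
print(route_summary([(5.0, 40.0), (7.5, 0.0), (4.0, 120.0)], beeline_km=12.0))
```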

Relevance:

30.00%

Publisher:

Abstract:

Internship report presented in fulfilment of the requirements for the degree of Master in Organisational Information Systems.

Relevance:

30.00%

Publisher:

Abstract:

anogi computes the "Analysis of Gini" (ANOGI) for population sub-groups, as proposed by Frick et al. (2006, Sociological Methods and Research 34(4)).
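
A minimal Python sketch of the underlying idea: decompose the overall Gini into between-group, within-group and overlap (residual) terms. This illustrates the arithmetic that ANOGI-style analyses build on; it is not the anogi Stata module itself.

```python
import numpy as np

def gini(x):
    """Gini coefficient via the mean absolute difference: G = MAD / (2 * mean)."""
    x = np.asarray(x, dtype=float)
    mad = np.abs(x[:, None] - x[None, :]).mean()
    return mad / (2.0 * x.mean())

def gini_decomposition(values, groups):
    """Split the overall Gini into between-group, within-group and an
    overlap (residual) term for population sub-groups."""
    values = np.asarray(values, dtype=float)
    groups = np.asarray(groups)
    total = gini(values)
    # Between-group term: Gini of a population where everyone gets their group mean.
    group_means = {g: values[groups == g].mean() for g in np.unique(groups)}
    between = gini(np.array([group_means[g] for g in groups]))
    # Within-group term: group Ginis weighted by population share * income share.
    within = 0.0
    for g in group_means:
        sub = values[groups == g]
        p_k = len(sub) / len(values)          # population share
        s_k = sub.sum() / values.sum()        # income share
        within += p_k * s_k * gini(sub)
    overlap = total - between - within        # residual captures group overlapping
    return {"total": total, "between": between, "within": within, "overlap": overlap}

# Example with two hypothetical sub-groups
vals = [10, 12, 15, 20, 22, 30, 40, 55]
grps = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(gini_decomposition(vals, grps))
```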

Relevance:

30.00%

Publisher:

Abstract:

wgttest performs a test proposed by DuMouchel and Duncan (1983) to evaluate whether the weighted and unweighted estimates of a regression model are significantly different.
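
A minimal sketch of one common way to carry out the DuMouchel-Duncan test: augment the unweighted regression with weight × covariate interaction terms and jointly F-test those interactions. This illustrates the idea using statsmodels and is not the wgttest Stata command; the data below are synthetic.

```python
import numpy as np
import statsmodels.api as sm

def dumouchel_duncan_test(y, X, w):
    """Test whether weighted and unweighted OLS estimates differ by adding
    weight x covariate interaction terms and jointly F-testing them."""
    X = sm.add_constant(np.asarray(X, dtype=float))
    w = np.asarray(w, dtype=float)
    Z = X * w[:, None]                      # each column (incl. intercept) times w
    design = np.column_stack([X, Z])
    fit = sm.OLS(np.asarray(y, dtype=float), design).fit()
    k = X.shape[1]
    # Restriction: the k interaction coefficients are jointly zero.
    R = np.hstack([np.zeros((k, k)), np.eye(k)])
    return fit.f_test(R)                    # F statistic, p-value, degrees of freedom

# Hypothetical data: outcome y, one covariate x, sampling weights w
rng = np.random.default_rng(0)
x = rng.normal(size=200)
w = rng.uniform(0.5, 2.0, size=200)
y = 1.0 + 2.0 * x + rng.normal(size=200)
print(dumouchel_duncan_test(y, x[:, None], w))
```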

Relevance:

30.00%

Publisher:

Abstract:

Null dereferencing is one of the most frequent bugs in Java systems, causing programs to crash due to an uncaught NullPointerException. Developers often fix this bug by introducing a guard (i.e., a null check) on the potentially-null objects before using them. In this paper we investigate the null checks in 717 open-source Java systems to understand when and why developers introduce null checks. We find that 35% of the if-statements are null checks. A deeper investigation shows that 71% of the checked-for-null objects are returned from method calls. This indicates that null checks have a serious impact on performance and that developers introduce null checks when they use methods that return null.
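
A rough Python sketch of the kind of counting such a study involves (not the authors' analysis pipeline): scan Java source files, count if-statements, and count those that compare an expression with null. The project path is a hypothetical placeholder, and the regexes are deliberately simplistic.

```python
import re
from pathlib import Path

IF_RE = re.compile(r"\bif\s*\(")
NULL_CHECK_RE = re.compile(r"\bif\s*\([^)]*(==|!=)\s*null")   # very rough heuristic

def count_null_checks(root):
    """Rough count of if-statements and null-check guards in .java files under root."""
    ifs = null_checks = 0
    for path in Path(root).rglob("*.java"):
        src = path.read_text(errors="ignore")
        ifs += len(IF_RE.findall(src))
        null_checks += len(NULL_CHECK_RE.findall(src))
    return ifs, null_checks

ifs, checks = count_null_checks("path/to/java/project")   # hypothetical path
if ifs:
    print(f"{checks}/{ifs} if-statements are null checks ({100 * checks / ifs:.1f}%)")
```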

Relevance:

30.00%

Publisher:

Abstract:

The aim of analogue model experiments in geology is to simulate structures in nature under specific imposed boundary conditions using materials whose rheological properties are similar to those of rocks in nature. In the late 1980s, X-ray computed tomography (CT) was first applied to the analysis of such models. In early studies only a limited number of cross-sectional slices could be recorded because of the time involved in CT data acquisition, the long cooling periods required by the X-ray source, and limited computational capacity. Technological improvements now allow an almost unlimited number of closely spaced serial cross-sections to be acquired and calculated. Computer visualization software allows a full 3D analysis of every recorded stage. Such analyses are especially valuable when trying to understand complex geological structures, commonly with lateral changes in 3D geometry. Periodic acquisition of volumetric data sets in the course of the experiment makes it possible to carry out a 4D analysis of the model, i.e. 3D analysis through time. Examples are shown of 4D analyses of analogue models that tested the influence of lateral rheological changes on the structures obtained in contractional and extensional settings.
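
A minimal illustration (not tied to any particular scanner or the authors' workflow) of how serial CT slices can be assembled into 3D volumes and, across repeated scanning stages, into a 4D (3D + time) array for analysis. The directory layout and file naming are hypothetical.

```python
import numpy as np
import imageio.v3 as iio
from pathlib import Path

def load_volume(slice_dir):
    """Stack serial cross-sectional CT slices (one image per file) into a 3D array."""
    files = sorted(Path(slice_dir).glob("slice_*.png"))   # hypothetical naming
    return np.stack([iio.imread(f) for f in files], axis=0)

def load_time_series(stage_dirs):
    """Stack 3D volumes recorded at successive experimental stages into a 4D array
    (time, z, y, x), enabling 3D analysis through time."""
    return np.stack([load_volume(d) for d in stage_dirs], axis=0)

# Example: three scanning stages of one analogue experiment (hypothetical paths)
volume_4d = load_time_series(["stage_00", "stage_01", "stage_02"])
print(volume_4d.shape)   # (n_stages, n_slices, height, width)
```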

Relevance:

30.00%

Publisher:

Abstract:

The Arctic Ocean system is a key player in the climatic changes of the Earth. Its highly sensitive ice cover, the exchange of surface and deep water masses with the global ocean, and the coupling with the atmosphere interact directly with global climatic changes. The outflow of cold polar water and sea ice influences the production of deep water in the North Atlantic and controls the global ocean circulation ("the conveyor belt"). The Arctic Ocean is surrounded by the large Northern Hemisphere ice sheets, which not only affect sedimentation in the Arctic Ocean but are also thought to drive the course of glacials and interglacials. Terrigenous sediment delivered from the ice sheets by icebergs and meltwater, as well as by sea ice, is a major component of Arctic Ocean sediments. Hence, the terrigenous content of Arctic Ocean sediments is an outstanding archive for investigating changes in the paleoenvironment. Glacigenic sediments of the Canadian Arctic Archipelago and surface samples from the Arctic Ocean and the Siberian shelf regions were investigated by means of X-ray diffraction of the bulk fraction, with the aim of deciphering the source regions of distinct mineral compositions. Given the complex circumpolar geology, stable crystalline shield rocks, active and ancient fold belts including magmatic and metamorphic rocks, sedimentary rocks, and wide periglacial lowlands with permafrost provide a complete range of possible mineral combinations. Non-glaciated shelf regions mix the local input from a possible point source of a particular mineral combination with the bulk shelf material and thus act as samplers of the entire region draining to the shelf. To take this into account, a literature survey was performed in which descriptions of outcropping lithologies and Arctic Ocean sediments were scanned for their mineral associations. The analyses of glacigenic and shelf sediments yielded a close relationship between their mineral composition and the adjacent source region. The most striking difference between the circumpolar source regions is the extensive outcrop of carbonate rocks in the vicinity of the Canadian Arctic Archipelago and in northern Greenland, while siliciclastic sediments dominate the Siberian shelves. In the Siberian shelf region, the eastern Kara Sea and the western Laptev Sea form a distinct region defined by high smectite, (clino-)pyroxene and plagioclase input. The source of this signal is the extensive outcrop of the Siberian trap basalts on the Putorana Plateau, which is drained by the tributaries of the Yenissei and Khatanga. The eastern Laptev Sea and the East Siberian Sea can also be treated as one source region, characterized by a feldspar, quartz, illite, mica and chlorite association combined with the trace minerals hornblende and epidote. Franz Josef Land provides a mineral composition rich in quartz and kaolinite. The diverse rock suite of the Svalbard archipelago supplies specific mineral compositions from highly metamorphic crystalline rocks, dolomite-rich carbonate rocks, and sedimentary rocks with a higher diagenetic potential, manifested in stable newly formed diagenetic minerals and high organic maturity. To reconstruct the last 30,000 years as an example of the transition between glacial and interglacial conditions, a profile of sediment cores recovered during the RV "Polarstern" expedition ARK-VIII/3 (ARCTIC '91), together with additional sediment cores from around Svalbard, was investigated.
Besides the mineralogy of different grain-size fractions, several additional sedimentological and organo-geochemical parameters were used, and a detailed stratigraphic framework was established. By exploiting this data set, changes in the mineral composition of the Eurasian Basin sediments can be related to climatic changes. Certain mineral compositions can even be associated with particular transport processes, e.g. the smectite/pyroxene association with sea-ice transport from the eastern Kara Sea and the western Laptev Sea. Hence, it is possible to decipher the complex interplay between the influx of warm Atlantic waters into the southwestern Eurasian Basin, the waxing and waning of the Svalbard/Barents Sea and Kara Sea ice sheets, the flooding of the Siberian shelf regions, and the surface and deep-water circulation. Until now the Arctic Ocean was assumed to be a rather stable system during the last 30,000 years, one that only switched from a completely ice-covered state during the glacial to seasonally open waters during the interglacial. This work, however, using mineral assemblages of sediment cores in the vicinity of Svalbard, revealed fast changes in the inflow of warm Atlantic water with the West Spitsbergen Current (< 1,000 years), short periods of advance and retreat of the marine-based Eurasian ice sheets (1,000-3,000 years), and short melting phases (400 years?). Deglaciation of the marine-based Eurasian ice sheets and the land-based North American and Greenland ice sheets was not simultaneous. This thesis postulates that the Kara Sea Ice Sheet released an early meltwater signal prior to 15,000 14C years, leading the Barents Sea Ice Sheet, while the western land-based ice sheets followed later than 13,500 14C years. The northern Eurasian Basin records the shift between iceberg and sea-ice material derived from the Canadian Arctic Archipelago and northern Greenland and material transported by sea ice and surface currents from the Siberian shelf region. The phasing of the deglaciation becomes very obvious in the dolomite and quartz/phyllosilicate records. It is also suggested that the flooding of the Laptev Sea during the Holocene is manifested in a stepwise increase of sediment input at the Lomonosov Ridge between the Eurasian and Amerasian Basins. Depending on the strength of meltwater pulses from the adjacent ice sheets, the Transpolar Drift was probably relocated; these movements are traceable through the distribution of indicator minerals. Based on the outcome of this work, bulk mineral determination can be regarded as an excellent tool for paleoenvironmental reconstructions in the Arctic Ocean. The simple sample preparation and the objective determination of bulk mineralogy provided by the QUAX software give this analysis the potential to serve as a basic measuring method preceding more time-consuming and highly specialised mineralogical investigations (e.g. clay mineralogy, heavy-mineral determination).

Relevance:

30.00%

Publisher:

Abstract:

Bibliography: p. 23-34.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

A hydrogel intervertebral disc (IVD) model consisting of an inner nucleus core and an outer anulus ring was manufactured from 30 and 35% by weight poly(vinyl alcohol) hydrogel (PVA-H) concentrations and subjected to axial compression between saturated porous endplates at 200 N for 11 h 30 min. Repeat experiments (n = 4) on different samples (N = 2) show good reproducibility of fluid loss and axial deformation. An axisymmetric nonlinear poroelastic finite element model with variable permeability was developed using commercial finite element software to compare axial deformation and predicted fluid loss with experimental data. The FE predictions indicate differential fluid loss similar to that of biological IVDs, with the nucleus losing more water than the anulus, and there is overall good agreement between experimental and finite element predicted fluid loss. The stress distribution pattern indicates important similarities with the biological IVD, including stress transference from the nucleus to the anulus upon sustained loading, and renders the model suitable for use in future studies to better understand the role of fluid and stress in biological IVDs. (C) 2005 Springer Science + Business Media, Inc.
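
Not the authors' axisymmetric poroelastic FE model, but a minimal 1D sketch of the underlying mechanism: under a sustained load, excess pore pressure in a fluid-saturated poroelastic layer dissipates toward a draining boundary, which is what drives progressive fluid loss and creep. The sketch solves the 1D consolidation equation with an explicit finite-difference scheme; all material and loading values are placeholders.

```python
import numpy as np

# Placeholder material/loading values (illustrative only)
c_v = 1e-9        # consolidation (diffusion) coefficient, m^2/s
height = 0.01     # drainage path length, m
load = 0.2e6      # applied stress -> initial excess pore pressure, Pa
nz, dt = 51, 1.0  # grid points and time step, s

z = np.linspace(0.0, height, nz)
dz = z[1] - z[0]
assert c_v * dt / dz**2 < 0.5, "explicit scheme stability"

p = np.full(nz, load)          # initial excess pore pressure
p[-1] = 0.0                    # draining (porous endplate) boundary at the top

for step in range(int(11.5 * 3600 / dt)):               # 11 h 30 min of loading
    lap = (p[2:] - 2 * p[1:-1] + p[:-2]) / dz**2        # interior second derivative
    p0_new = p[0] + c_v * dt * 2 * (p[1] - p[0]) / dz**2  # impermeable bottom (ghost node)
    p[1:-1] += c_v * dt * lap                            # dp/dt = c_v * d2p/dz2
    p[0] = p0_new
    p[-1] = 0.0                                          # endplate stays fully drained

# Remaining pore pressure fraction ~ how much load is still carried by the fluid
print("mean remaining pore pressure fraction:", p.mean() / load)
```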

Relevance:

30.00%

Publisher:

Abstract:

Serotonin (5-hydroxytryptamine, 5-HT) is an amine neurotransmitter derived from tryptophan and is important in brain systems regulating mood, emotional behavior, and sleep. Selective serotonin reuptake inhibitor (SSRI) drugs are used to treat disorders such as depression, stress, eating disorders, autism, and schizophrenia. It is thought that these drugs act to prolong the action of 5-HT by blocking reuptake. This may lead to decreased 5-HT content in the nerve fibers themselves; however, this has not previously been directly demonstrated. We have studied the effects of administration of two drugs, imipramine and citalopram, on levels of 5-HT in nerve fibers in the murine brain. Quantitative analysis of the areal density of 5-HT fibers throughout the brain was performed using ImageJ software. While a high density of fibers was observed in mid- and hind-brain regions and areas such as thalamus and hypothalamus, densities were far lower in areas such as cortex, where SSRIs might be thought to exert their actions. As anticipated, imipramine and citalopram produced a decline in 5-HT levels in nerve fibers, but the result was not uniform. Areas such as inferior colliculus showed significant reduction whereas little, if any, change was observed in the adjacent superior colliculus. The reason for, and significance of, this regionality is unclear. It has been proposed that serotonin effects in the brain might be linked to changes in glutamatergic transmission. Extracellular glutamate levels are regulated primarily by glial glutamate transporters. Qualitative evaluation of glutamate transporter immunolabeling in cortex of control and drug-treated mice revealed no discernable difference in intensity of glutamate transporter immunoreactivity. These data suggest that changes in intracellular and extracellular levels of serotonin do not cause concomitant changes in astroglial glutamate transporter expression, and thus cannot represent a mechanism for the delayed efficacy of antidepressants when administered clinically. © 2005 Elsevier B.V. All rights reserved.
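
A minimal sketch of how an areal-density measurement of immunolabelled fibers can be made programmatically: threshold a (normalized) image and take the fraction of supra-threshold pixels in a region of interest. This only illustrates the kind of quantification ImageJ was used for in the study; the threshold and the synthetic image are assumed placeholders.

```python
import numpy as np

def areal_density(image, roi_mask, threshold=0.5):
    """Fraction of ROI pixels whose normalized intensity exceeds a threshold,
    used as a simple proxy for the areal density of labelled fibers."""
    image = np.asarray(image, dtype=float)
    image = (image - image.min()) / (np.ptp(image) + 1e-12)   # normalize to [0, 1]
    roi = np.asarray(roi_mask, dtype=bool)
    return float((image[roi] > threshold).mean())

# Synthetic example: bright "fiber-like" rows on a dark background inside a square ROI
rng = np.random.default_rng(1)
img = rng.random((256, 256)) * 0.4
img[::16, :] = 0.9                       # fake bright fibers
mask = np.zeros_like(img, dtype=bool)
mask[64:192, 64:192] = True
print("areal density:", areal_density(img, mask, threshold=0.5))
```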

Relevance:

30.00%

Publisher:

Abstract:

The aim of this report is to describe the use of WinBUGS for two datasets that arise from typical population pharmacokinetic studies. The first dataset relates to gentamicin concentration-time data that arose as part of routine clinical care of 55 neonates. The second dataset incorporated data from 96 patients receiving enoxaparin. Both datasets were originally analyzed using NONMEM. In the first instance, although NONMEM provided reasonable estimates of the fixed-effects parameters, it was unable to provide satisfactory estimates of the between-subject variance. In the second instance, the use of NONMEM resulted in the development of a successful model, albeit with limited available information on the between-subject variability of the pharmacokinetic parameters. WinBUGS was used to develop a model for both of these datasets. Model comparison for the enoxaparin dataset was performed using the posterior distribution of the log-likelihood and a posterior predictive check. The use of WinBUGS supported the same structural models tried in NONMEM. For the gentamicin dataset a one-compartment model with intravenous infusion was developed, and the population parameters, including the full between-subject variance-covariance matrix, were available. Analysis of the enoxaparin dataset supported a two-compartment model as superior to the one-compartment model, based on the posterior predictive check. Again, the full between-subject variance-covariance matrix parameters were available. Fully Bayesian approaches using MCMC methods, via WinBUGS, can offer added value for the analysis of population pharmacokinetic data.
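
For context, a minimal sketch of the structural one-compartment intravenous-infusion model that underlies a gentamicin analysis of this kind (concentration during and after a constant-rate infusion). The parameter values are illustrative placeholders, and this is not the WinBUGS or NONMEM code used in the report.

```python
import numpy as np

def conc_one_compartment_infusion(t, dose, t_inf, cl, v):
    """Plasma concentration for a one-compartment model with a constant-rate
    IV infusion of `dose` over `t_inf` hours (clearance cl, volume v)."""
    t = np.asarray(t, dtype=float)
    ke = cl / v                      # elimination rate constant
    rate = dose / t_inf              # infusion rate
    during = (rate / cl) * (1.0 - np.exp(-ke * t))
    c_end = (rate / cl) * (1.0 - np.exp(-ke * t_inf))
    after = c_end * np.exp(-ke * (t - t_inf))
    return np.where(t <= t_inf, during, after)

# Illustrative (not study) parameters: dose in mg, 0.5 h infusion,
# clearance in L/h and volume in L for a hypothetical neonate
times = np.linspace(0, 24, 9)
print(conc_one_compartment_infusion(times, dose=15.0, t_inf=0.5, cl=0.2, v=1.5))
```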

Relevance:

30.00%

Publisher:

Abstract:

Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
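
A small sketch of the general idea of reachability-matrix information-flow checking (not the authors' compositional method): represent direct flows between components as a boolean adjacency matrix, add the extra flows introduced by a fault mode, take the transitive closure, and flag any reachability from a high-sensitivity source to a low-sensitivity sink. The component names and fault mode are hypothetical.

```python
import numpy as np

def transitive_closure(adj):
    """Boolean reachability matrix from a direct-flow adjacency matrix (Warshall)."""
    reach = adj.copy().astype(bool)
    for k in range(len(reach)):
        reach |= reach[:, k:k + 1] & reach[k:k + 1, :]
    return reach

# Hypothetical device: 0=key store (high), 1=crypto unit, 2=log, 3=network port (low)
components = ["key_store", "crypto", "log", "net_port"]
normal = np.zeros((4, 4), dtype=bool)
normal[0, 1] = True          # key store feeds the crypto unit
normal[2, 3] = True          # log contents are exported over the network

# Fault mode of the crypto unit: it dumps internal state (including the key) to the log
faulty = normal.copy()
faulty[1, 2] = True

def undesirable_flows(adj, high=(0,), low=(3,)):
    reach = transitive_closure(adj)
    return [(components[h], components[l]) for h in high for l in low if reach[h, l]]

print("normal operation:", undesirable_flows(normal))   # expect: no high-to-low flow
print("under fault mode:", undesirable_flows(faulty))   # expect: key_store -> net_port
```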

Relevance:

30.00%

Publisher:

Abstract:

Processor emulators are software tools that allow legacy computer programs to be executed on a modern processor. In the past, emulators have been used in trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation, coupled with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, thus allowing their behaviours to be compared directly.
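
As a toy illustration of what a processor emulator does, and of giving behaviour an explicit semantics that can be compared state by state against another implementation, a sketch that interprets a tiny hypothetical legacy instruction set and records the state trace. The instruction set, registers and program here are invented for illustration and are not from the paper.

```python
# Toy emulator for a hypothetical 3-instruction legacy ISA. The emulated semantics
# yields an explicit state trace that could be compared against the behaviour of
# the original (or re-hosted) system.

def emulate(program, steps=100):
    """Interpret (opcode, operand) pairs; return the trace of register states."""
    regs = {"ACC": 0, "PC": 0}
    trace = [dict(regs)]
    while regs["PC"] < len(program) and len(trace) <= steps:
        op, arg = program[regs["PC"]]
        if op == "LOAD":       # ACC := immediate
            regs["ACC"] = arg
        elif op == "ADD":      # ACC := ACC + immediate
            regs["ACC"] += arg
        elif op == "JNZ":      # jump to address if ACC != 0
            if regs["ACC"] != 0:
                regs["PC"] = arg
                trace.append(dict(regs))
                continue
        regs["PC"] += 1
        trace.append(dict(regs))
    return trace

legacy_program = [("LOAD", 3), ("ADD", -1), ("JNZ", 1)]   # counts ACC down to zero
for state in emulate(legacy_program):
    print(state)
```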