997 results for Sub-seafloor modeling


Relevance: 30.00%

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files, and during such sessions it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session; highly secure authentication methods are therefore required. We posit that each of us is unique in our use of computer systems, and it is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make each user unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation, and large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users, using two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis.
A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices, and in the analysis of network traffic.
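The n-gram scoring idea can be sketched as follows. This is a minimal illustration with add-one smoothing; the action names, scoring rule, and vocabulary size are assumptions for the example, not details of Intruder Detector.

```python
from collections import defaultdict
import math

def train_ngram(sessions, n=2):
    """Count n-grams of user actions across training sessions."""
    counts, context_counts = defaultdict(int), defaultdict(int)
    for actions in sessions:
        padded = ["<s>"] * (n - 1) + actions
        for i in range(len(actions)):
            gram = tuple(padded[i:i + n])
            counts[gram] += 1
            context_counts[gram[:-1]] += 1
    return counts, context_counts

def log_likelihood(actions, counts, context_counts, n=2, vocab=50):
    """Average log-probability of a session under the trained model;
    low values flag deviation from 'normal behavior'."""
    padded = ["<s>"] * (n - 1) + actions
    total = 0.0
    for i in range(len(actions)):
        gram = tuple(padded[i:i + n])
        # Add-one smoothing so unseen transitions get nonzero probability.
        p = (counts[gram] + 1) / (context_counts[gram[:-1]] + vocab)
        total += math.log(p)
    return total / len(actions)
```

In use, sessions scoring far below the scores of the user's own training sessions would be flagged for review.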

Relevance: 30.00%

Abstract:

Water use efficiency (WUE) is considered a determinant of yield under stress and a component of crop drought resistance. Stomatal behavior regulates both transpiration rate and net assimilation and has been suggested to be crucial for improving crop WUE. In this work, a dynamic model was used to examine the impact of the dynamic properties of stomata on WUE. The model includes sub-models of stomatal conductance dynamics, solute accumulation in the mesophyll, mesophyll water content, and water flow to the mesophyll. From the instantaneous value of stomatal conductance, photosynthesis and transpiration rate were simulated using a biochemical model and the Penman-Monteith equation, respectively. The model was parameterized for a cucumber leaf, and model outputs were evaluated using climatic data. Our simulations revealed that WUE was higher on a cloudy day than on a sunny day. Fast stomatal reaction to light decreased WUE during periods of increasing light (e.g., in the morning) by up to 10.2% and increased WUE during periods of decreasing light (afternoon) by up to 6.25%. The sensitivity of daily WUE to stomatal parameters and to mesophyll conductance to CO2 was tested for sunny and cloudy days. Increasing mesophyll conductance to CO2 was more likely to increase WUE under all climatic conditions (by up to 5.5% on the sunny day) than modifications of stomatal reaction speed to light or of maximum stomatal conductance.
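A toy illustration (not the parameterized cucumber model) of how stomatal reaction speed enters such simulations: first-order relaxation of conductance toward a light-dependent target, with assimilation saturating in conductance and transpiration linear in it. All coefficients below are invented for the sketch.

```python
def simulate_wue(tau, light, dt=60.0):
    """Toy WUE simulation. tau: stomatal time constant (s);
    light: sequence of photon flux values, one per time step.
    Returns total assimilation / total transpiration."""
    g = 0.05  # stomatal conductance, mol m-2 s-1 (illustrative start)
    A_sum = E_sum = 0.0
    for I in light:
        g_target = 0.05 + 0.4 * I / (I + 500.0)  # saturating light response
        g += (g_target - g) * dt / tau           # first-order relaxation
        A = 30.0 * g / (g + 0.2)                 # CO2 uptake, saturating in g
        E = 1.5 * g                              # transpiration, linear in g
        A_sum += A * dt
        E_sum += E * dt
    return A_sum / E_sum
```

Comparing slow and fast `tau` under a fluctuating ("cloudy") light trace reproduces the kind of trade-off the abstract describes, though the magnitudes here mean nothing.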

Relevance: 30.00%

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Direito, 2016.

Relevance: 30.00%

Abstract:

Is fairness in process and outcome a generalizable driver of police legitimacy? In many industrialized nations, studies have demonstrated that police legitimacy is largely a function of whether citizens perceive treatment as normatively fair and respectful. Questions remain as to whether this model holds in less-industrialized contexts, where corruption and security challenges favor instrumental preferences for effective crime control and prevention. Support both for and against the normative model of legitimacy has been found in less-industrialized countries, yet few studies have simultaneously compared these models across multiple industrializing countries. Using a multilevel framework and data from respondents in 27 countries in sub-Saharan Africa (n ≈ 43,000), I find evidence for both instrumental and normative influences in shaping perceptions of police legitimacy. More importantly, the internal consistency of legitimacy (defined as obligation to obey, moral alignment, and perceived legality of the police) varies considerably from country to country, suggesting that the relationships between legality, morality, and obligation operate differently across contexts. Results are robust to a number of different modeling assumptions and alternative explanations. Overall, the results indicate that both fairness and effectiveness matter, though not in all places, and in some cases contrary to theoretical expectations.

Relevance: 30.00%

Abstract:

Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust and quantitative observational constraints through Bayesian inference for earthquake source models, and physical insights on the interconnections of seismic and aseismic fault behavior from elastodynamic modeling of earthquake ruptures and aseismic processes.

To constrain the shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches to explore the maximum resolution of the tsunami data, with a focus on incorporating the uncertainty in the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field in the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms and for the coseismic fault slip models in the 2010 Mw 8.8 Maule earthquake with complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we are able to quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge during both events, featuring the peak of uplift near the edge of the accretionary wedge with a decay toward the trench axis, with implications for fault failure and tsunamigenic mechanisms of megathrust earthquakes.
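A minimal sketch of the Bayesian machinery behind such inversions: a Metropolis random walk over a single source parameter, with a toy linear forward model standing in for the tsunami and geodetic forward models, and a Gaussian likelihood with known data error.

```python
import random, math

def metropolis(data, forward, sigma, n_steps=20000, step=0.1, m0=0.0):
    """Metropolis sampler for one model parameter m, flat prior.
    forward(m) returns predicted data; sigma is the data error."""
    def log_like(m):
        return -0.5 * sum((d - f) ** 2
                          for d, f in zip(data, forward(m))) / sigma ** 2
    m, ll = m0, log_like(m0)
    samples = []
    for _ in range(n_steps):
        m_new = m + random.gauss(0.0, step)   # propose a random step
        ll_new = log_like(m_new)
        if math.log(random.random()) < ll_new - ll:  # accept/reject
            m, ll = m_new, ll_new
        samples.append(m)
    return samples
```

The posterior histogram of `samples` (after burn-in) gives both the parameter estimate and its uncertainty, which is the quantity the thesis uses to bound near-trench slip profiles.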

To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes by deeper penetration of large earthquakes below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the modeling to connect the geodetic observables of fault locking with the behavior of seismicity in numerical models, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.
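The rate-and-state framework referenced above rests on a standard steady-state friction law, which separates velocity-weakening (seismogenic) from velocity-strengthening (stably creeping) fault behavior. A one-line sketch with illustrative parameter values:

```python
import math

def mu_ss(V, mu0=0.6, a=0.010, b=0.015, V0=1e-6):
    """Steady-state rate-and-state friction coefficient at slip rate V.
    a - b < 0: velocity weakening (can nucleate earthquakes);
    a - b > 0: velocity strengthening (stable creep).
    Values here are illustrative, not the thesis's fault models."""
    return mu0 + (a - b) * math.log(V / V0)
```

With `a - b < 0` as above, friction drops as slip accelerates, which is the instability that dynamic rupture simulations exploit; the creeping extensions below the seismogenic zone correspond to `a - b > 0`.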

Relevance: 30.00%

Abstract:

This thesis is focused on improving the calibration accuracy of sub-millimeter astronomical observations. The wavelength range covered by observational radio astronomy has been extended to the sub-millimeter and far infrared with the advancement of receiver technology in recent years. Sub-millimeter observations carried out with airborne and ground-based telescopes typically suffer from 10% to 90% attenuation of the astronomical source signals by the terrestrial atmosphere. The amount of attenuation can be derived from the measured brightness of the atmospheric emission. In order to do this, knowledge of the atmospheric temperature and chemical composition, as well as of the frequency-dependent optical depth at each point along the line of sight, is required. The altitude-dependent air temperature and composition are estimated using a parametrized static atmospheric model, described in Chapter 2, because direct measurements are technically and financially infeasible. The frequency-dependent optical depth of the atmosphere is computed with a radiative transfer model based on quantum mechanics together with some empirical formulae. The choice, application, and improvement of third-party radiative transfer models are discussed in Chapter 3. The application of the calibration procedure, described in Chapter 4, to astronomical data observed with the SubMillimeter Array Receiver for Two Frequencies (SMART) and the German REceiver for Astronomy at Terahertz Frequencies (GREAT) is presented in Chapters 5 and 6. The brightnesses of atmospheric emission were fitted consistently to simultaneous multi-band observation data from GREAT at 1.2-1.4 and 1.8-1.9 THz with a single set of parameters of the static atmospheric model. On the other hand, the cause of the inconsistency between the model parameters fitted from the 490 and 810 GHz data of SMART was found to be the lack of calibration of the effective cold load temperature.
Besides the correctness of atmospheric modeling, the stability of the receiver is also important for achieving optimal calibration accuracy. The stabilities of SMART and GREAT are analyzed with a special calibration procedure, namely the “load calibration”. The effects of drift and fluctuation of the receiver gain and noise temperature on calibration accuracy are discussed in Chapters 5 and 6, and alternative observing strategies are proposed to combat receiver instability. The methods and conclusions presented in this thesis are applicable to the atmospheric calibration of sub-millimeter astronomical observations up to at least 4.7 THz (the H-channel frequency of GREAT) for observations carried out from ∼4 to 14 km altitude. The procedures for receiver gain calibration and stability tests are applicable to other instruments using the same calibration approach as SMART and GREAT. The structure of the high-performance, modular, and extensible calibration program used and further developed for this thesis is presented in Appendix C.
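The core attenuation correction can be sketched for the simplest possible case, a single isothermal atmospheric layer (the thesis uses full altitude-dependent profiles; this is only the textbook relation T_sky = T_atm(1 - exp(-tau))):

```python
import math

def optical_depth(T_sky, T_atm):
    """Invert the single-layer radiative transfer relation
    T_sky = T_atm * (1 - exp(-tau)) for the zenith optical depth tau,
    given the measured sky brightness and an assumed layer temperature."""
    return -math.log(1.0 - T_sky / T_atm)

def correct_attenuation(T_measured, tau, airmass=1.0):
    """Scale a measured source brightness back above the atmosphere."""
    return T_measured * math.exp(tau * airmass)
```

The 10% to 90% attenuation quoted above corresponds to optical depths of roughly 0.1 to 2.3 along the line of sight.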

Relevance: 30.00%

Abstract:

Migratory bird species breeding in the Palearctic and overwintering in sub-Saharan Africa face multiple conservation challenges. As a result, many of these species have declined in recent decades, some dramatically. We therefore used the best available database for the distribution of 68 passerine migrants in sub-Saharan Africa to determine priority regions for their conservation. After modeling each species’ distribution using BIOMOD software, we entered the resulting species distributions at a 1° × 1° grid resolution into MARXAN software. We then used several different selection procedures that varied the boundary length modifier, species penalty factor, and the inclusion of grid cells with high human footprint and with protected areas. While results differed between selection procedures, four main regions were regularly selected: (1) one centered on southern Mali; (2) one including Eritrea, central Sudan, and northern Ethiopia; (3) one encompassing southwestern Kenya and much of Tanzania and Uganda; and (4) one including much of Zimbabwe and southwestern Zambia. We recommend that these four regions become priority regions for research and conservation efforts for the bird species considered in this study.

Relevance: 30.00%

Abstract:

This dissertation shows how recent statistical analysis tools and open datasets can be exploited to improve modelling accuracy in two distinct yet interconnected domains of flood hazard (FH) assessment. In the first part, unsupervised artificial neural networks are employed as regional models for sub-daily rainfall extremes. The models aim to learn a robust relation for locally estimating the parameters of Gumbel distributions of extreme rainfall depths for any sub-daily duration (1-24 h). The predictions depend on twenty morphoclimatic descriptors. A large study area in north-central Italy is adopted, where 2238 annual maximum series are available. Validation is performed over an independent set of 100 gauges. Our results show that multivariate ANNs may remarkably improve the estimation of percentiles relative to the benchmark approach from the literature, in which Gumbel parameters depend on mean annual precipitation. Finally, we show that the very nature of the proposed ANN models makes them suitable for interpolating predicted sub-daily rainfall quantiles across space and across time-aggregation intervals. In the second part, decision trees are used to combine a selected blend of input geomorphic descriptors for predicting FH. Relative to existing DEM-based approaches, this method is innovative in that it relies on the combination of three characteristics: (1) simple multivariate models, (2) a set of exclusively DEM-based descriptors as input, and (3) an existing FH map as reference information. The methods are applied first to northern Italy, represented with the MERIT DEM (∼90 m resolution), and then to the whole of Italy, represented with the EU-DEM (25 m resolution).
The results show that multivariate approaches may (a) significantly enhance the delineation of flood-prone areas relative to a selected univariate approach, (b) provide accurate predictions of expected inundation depths, (c) produce encouraging results in extrapolation, (d) complete the information of imperfect reference maps, and (e) conveniently convert binary maps into continuous representations of FH.
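The benchmark-style Gumbel estimation referenced in the first part can be sketched with a method-of-moments fit and the standard quantile formula (the annual-maximum data and the ANN predictors are not reproduced; the series below is invented):

```python
import math

def gumbel_fit_moments(annual_maxima):
    """Method-of-moments Gumbel fit: scale beta from the sample
    standard deviation, location mu from the sample mean."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772 * beta  # Euler-Mascheroni constant
    return mu, beta

def gumbel_quantile(mu, beta, T):
    """Rainfall depth with return period T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

The regional models in the dissertation replace the per-gauge fit with parameters predicted from morphoclimatic descriptors, but the quantile step is the same.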

Relevance: 20.00%

Abstract:

In this study, transmission-line modeling (TLM) applied to bio-thermal problems was improved by incorporating several novel computational techniques. These include graded meshes, which, in analyzing heat flow through heterogeneous media, were nine times faster and used only a fraction (16%) of the computational resources required by regular meshes. Graded meshes, unlike regular meshes, allow heat sources to be modeled in all segments of the mesh. A new boundary condition that accounts for thermal properties, resulting in more realistic modeling of complex problems, is introduced, along with a new way of calculating an error parameter. The calculated temperatures between nodes were compared against results from the literature and agreed to within 1%. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential in heat-transfer analysis of biological systems.
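The TLM formulation itself is not reproduced here; as a stand-in, a minimal explicit finite-difference sketch of 1-D heat conduction through a heterogeneous medium illustrates the class of problem being solved (node spacing, diffusivities, and fixed-temperature boundaries are all illustrative):

```python
def heat_step(T, alpha, dx, dt):
    """One explicit time step of 1-D heat conduction with
    position-dependent diffusivity alpha (heterogeneous medium).
    Boundary nodes are held fixed. Stable for dt <= dx**2 / (2*max(alpha))."""
    T_new = T[:]
    for i in range(1, len(T) - 1):
        T_new[i] = T[i] + alpha[i] * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
    return T_new
```

A graded mesh replaces the uniform `dx` with node-dependent spacing so that fine resolution is spent only where gradients are steep, which is where the reported 16% resource usage comes from.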

Relevance: 20.00%

Abstract:

American tegumentary leishmaniasis (ATL) is a disease transmitted to humans by female sandflies of the genus Lutzomyia. Several factors are involved in the disease transmission cycle; in this work, only rainfall and deforestation were considered to assess the variability in the incidence of ATL. To this end, monthly records of the incidence of ATL in Orán, Salta, Argentina, over the period 1985-2007 were used. The square root of the relative incidence of ATL and the corresponding variance were formulated as time series, and these data were smoothed by moving averages of 12 and 24 months, respectively. The same procedure was applied to the rainfall data. Typical months (April, August, and December) were identified, allowing us to describe the dynamical behavior of ATL outbreaks. These results were tested at the 95% confidence level. We conclude that the variability of rainfall alone would not be enough to explain the epidemic outbreaks of ATL in the period 1997-2000, but it consistently explains the situation observed in 2002 and 2004. Deforestation activities that occurred in this region could explain the epidemic peaks observed in both years, and also during the entire observation period except 2005-2007.
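The smoothing step described above can be sketched as follows; the case counts and window handling are illustrative, and only the 12-month window for the incidence series is shown.

```python
import math

def smooth_incidence(cases, population, window=12):
    """Square root of relative incidence, smoothed by a trailing
    moving average of `window` months (12 for incidence in the study;
    the variance series used a 24-month window)."""
    rel = [math.sqrt(c / p) for c, p in zip(cases, population)]
    return [sum(rel[i:i + window]) / window
            for i in range(len(rel) - window + 1)]
```

The same routine applied to the rainfall series makes the two smoothed curves directly comparable month by month.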

Relevance: 20.00%

Abstract:

In this work, all publicly accessible published findings on the heat resistance of Alicyclobacillus acidoterrestris in fruit beverages, as affected by temperature and pH, were compiled. Study characteristics (protocols, fruit and variety, °Brix, pH, temperature, heating medium, culture medium, inactivation method, strains, etc.) were then extracted from the primary studies, and some of them were incorporated into a meta-analysis mixed-effects linear model based on the basic Bigelow equation describing the heat resistance parameters of this bacterium. The model estimated mean D* values (the time needed for a one-log reduction at a temperature of 95 °C and a pH of 3.5) of Alicyclobacillus in beverages of different fruits, of two concentration types, with and without bacteriocins, and with and without clarification. The zT values (the temperature change needed to cause a one-log reduction in D-values) estimated by the meta-analysis model were compared to the 'observed' zT values reported in the primary studies, and in all cases the latter fell within the confidence intervals of the model. The model was capable of predicting the heat resistance parameters of Alicyclobacillus in fruit beverages beyond the types available in the meta-analytical data. It is expected that this compilation of the thermal resistance of Alicyclobacillus in fruit beverages will be of use to food quality managers in determining or validating the lethality of their current heat treatment processes.
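The basic Bigelow relation underlying the model can be sketched as follows, with an assumed linear pH term and illustrative parameter values (the study's fitted values and its exact pH dependence are not reproduced here):

```python
import math

def d_value(T, pH, D_ref=2.5, zT=10.0, z_pH=4.0, T_ref=95.0, pH_ref=3.5):
    """Bigelow-type secondary model: decimal reduction time D (minutes)
    at temperature T (deg C) and pH, relative to the reference D* at
    95 deg C and pH 3.5. Higher T shortens D by one log per zT degrees;
    the pH term here assumes D grows as the medium becomes less acidic.
    All parameter values are illustrative."""
    log10_D = (math.log10(D_ref)
               - (T - T_ref) / zT
               + (pH - pH_ref) / z_pH)
    return 10.0 ** log10_D
```

With `zT = 10`, each 10 °C rise above the reference temperature cuts D by a factor of ten, which is exactly the one-log-per-zT definition quoted in the abstract.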

Relevance: 20.00%

Abstract:

The solubility of caffeine in supercritical CO2 was studied by assessing the effects of pressure and temperature on the extraction of green coffee oil (GCO). The Peng-Robinson¹ equation of state was used to correlate the solubility of caffeine with a thermodynamic model, and two mixing rules were evaluated: the classical van der Waals mixing rule with two adjustable parameters (PR-VDW), and a density-dependent one proposed by Mohamed and Holder², with two (PR-MH: two parameters adjusted to the attractive term) or three (PR-MH3: two parameters adjusted to the attractive term and one to the repulsive term) adjustable parameters. The best results were obtained with the three-parameter mixing rule of Mohamed and Holder².
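For the pure-component building block of such calculations, a Peng-Robinson compressibility evaluation can be sketched (the van der Waals and Mohamed-Holder mixing rules for the caffeine/CO2 mixture are not reproduced; the critical constants used in the test are standard handbook values for CO2):

```python
import numpy as np

R = 8.314  # J mol-1 K-1

def pr_compressibility(T, P, Tc, Pc, omega):
    """Compressibility factor Z of a pure fluid from the Peng-Robinson
    equation of state; returns the largest real root of the cubic
    (the vapor/supercritical branch)."""
    Tr = T / Tc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - Tr ** 0.5)) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha   # attractive term
    b = 0.07780 * R * Tc / Pc                     # repulsive (co-volume) term
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # Cubic in Z: Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B ** 2 - 2.0 * B,
              -(A * B - B ** 2 - B ** 3)]
    roots = np.roots(coeffs)
    real = roots[np.abs(roots.imag) < 1e-10].real
    return float(max(real))
```

The mixing rules compared in the study modify how `a` and `b` are composed from the pure-component values; the cubic solution step is unchanged.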

Relevance: 20.00%

Abstract:

Handball is a sport that demands endurance associated with fast and powerful actions such as jumps, blocks, sprints, and throws. The aim of this study was to evaluate the effects of 38 weeks of systematic physical training, applied to a women's under-21 handball team, on upper- and lower-limb power, 30-m sprint speed, and endurance. The periodization applied was an adaptation of the Verkhoshansky theory and aimed at two performance peaks during the season, with six data collections. The median and range values for the three-kg medicine ball throw were: 2.98 m (2.15-3.50); 2.84 m (2.43-3.20); 2.90 m (2.60-3.38); 3.10 m (2.83-3.81); 2.84 m (2.55-3.57); and 3.34 m (2.93-3.83). For the three-pass running test: 5.60 m (4.93-6.58); 5.37 m (5.04-6.38); 5.36 m (4.93-6.12); 5.65 m (4.80-6.78); 5.63 m (5.00-6.40); and 5.83 m (5.14-6.05). For the 30-m sprint test: 5.8 m/s (5.45-6.44); 6.64 m/s (6.24-7.09); 5.65 m/s (5.17-5.95); (this test was not performed at the fourth time point); 6.19 m/s (5.57-6.26); and 5.83 m/s (5.14-6.05). For the 30-m sprint endurance test until a 10% decrease: 4 sprints (4-6); 5 sprints (4-9); 4.5 sprints (4-16); (this test was not performed at the fourth time point); 6 sprints (4-12); and 5 sprints (4-5). Significant differences (p<0.05) were observed in the three-kg medicine ball throw and three-pass running tests in at least one of the planned performance peaks, with no significant differences in the 30-m sprint speed or endurance tests. The applied physical training was efficient at improving specific physical fitness at the performance peaks, as well as providing support for better adjustment of physical training for the upcoming season.

Relevance: 20.00%

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance: 20.00%

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física