884 results for Chemical etching method combining static etching and dynamic etching


Relevance:

100.00%

Publisher:

Abstract:

A simple but efficient voice activity detector based on the Hilbert transform and a dynamic threshold is presented for use in the pre-processing of audio signals. The algorithm that defines the dynamic threshold is a modification of a convex combination found in the literature. This scheme allows the detection of prosodic and silence segments in speech in the presence of non-ideal conditions such as spectrally overlapped noise. The present work shows preliminary results on a database built from political speeches. The tests were performed by adding artificial noise, on top of the natural noise already present, to the audio signals, and several algorithms are compared. The results will be extended to adaptive filtering of monophonic signals and to the analysis of speech pathologies in future work.
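
A minimal sketch of the general idea described above, assuming a NumPy/SciPy environment; the envelope framing and the convex-combination threshold update used here are illustrative choices, not the authors' exact formulation:

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_envelope_vad(x, fs, frame_ms=20, alpha=0.98, beta=0.7):
    """Frame-level voice activity decisions from the Hilbert envelope.

    The threshold is updated as a convex combination of slowly tracked
    minimum and maximum frame energies (illustrative rule only).
    """
    env = np.abs(hilbert(x))                 # magnitude of the analytic signal
    frame = int(fs * frame_ms / 1000)
    n_frames = len(env) // frame
    energies = env[:n_frames * frame].reshape(n_frames, frame).mean(axis=1)

    e_min = e_max = energies[0]
    decisions = np.zeros(n_frames, dtype=bool)
    for i, e in enumerate(energies):
        # track the noise floor and the speech peaks
        e_min = alpha * e_min + (1 - alpha) * e if e > e_min else e
        e_max = alpha * e_max + (1 - alpha) * e if e < e_max else e
        threshold = beta * e_min + (1 - beta) * e_max   # convex combination
        decisions[i] = e > threshold
    return decisions
```

Because the threshold is recomputed every frame, the detector can follow a non-stationary noise floor, which is what the abstract refers to as non-ideal conditions.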

Relevance:

100.00%

Publisher:

Abstract:

In the last decade, research in Computer Vision has developed several algorithms to help botanists and non-experts classify plants based on images of their leaves. LeafSnap is a mobile application that uses a multiscale curvature model of the leaf margin to classify leaf images into species. It has achieved high levels of accuracy on 184 tree species from the northeastern US. We extend the research that led to the development of LeafSnap along two lines. First, LeafSnap's underlying algorithms are applied to a set of 66 tree species from Costa Rica. Then, texture is used as an additional criterion to measure the level of improvement achieved in the automatic identification of Costa Rican tree species. A 25.6% improvement was achieved for a Costa Rican clean image dataset and 42.5% for a Costa Rican noisy image dataset. In both cases, our results show this improvement to be statistically significant. Further statistical analysis of the impact of visual noise, the best algorithm combinations per species, and the best value of k (the minimal cardinality of the set of candidate species that the tested algorithms render as best matches) is also presented in this research.
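
A rough sketch of this kind of pipeline, combining a curvature (shape) histogram with a texture histogram and ranking candidate species by nearest neighbours; the feature construction and the scikit-learn usage are illustrative assumptions, not LeafSnap's actual implementation:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def combined_descriptor(curvature_hist, texture_hist, w_texture=0.5):
    """Concatenate L2-normalised shape and texture histograms into one feature vector."""
    c = curvature_hist / (np.linalg.norm(curvature_hist) + 1e-12)
    t = texture_hist / (np.linalg.norm(texture_hist) + 1e-12)
    return np.concatenate([c, w_texture * t])

def candidate_species(train_feats, train_labels, query_feat, k=5):
    """Return the species of the k nearest training leaves, best match first."""
    nn = NearestNeighbors(n_neighbors=k).fit(train_feats)
    _, idx = nn.kneighbors([query_feat])
    ranked = [train_labels[i] for i in idx[0]]
    return list(dict.fromkeys(ranked))  # unique species, preserving rank order
```

The parameter k here plays the same role as in the abstract: the size of the candidate set from which the correct species should be recoverable.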

Relevance:

100.00%

Publisher:

Abstract:

This model-driven approach to service evolution in clouds focuses mainly on the advantage of reusable evolution patterns for solving evolution problems. During the process, evolution patterns are driven by MDA models into pattern aspects. Weaving these aspects into the service-based process at runtime, using an aspect-oriented extension of a BPEL engine, provides the dynamic feature of the evolution.
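
As a loose conceptual analogue only (plain Python decorators rather than an aspect-oriented BPEL engine, and not the framework described in this abstract), runtime aspect weaving can be pictured as wrapping advice around existing process activities without modifying them:

```python
from functools import wraps

def weave(aspect):
    """Wrap before/after advice from an 'aspect' around a process activity."""
    def decorator(activity):
        @wraps(activity)
        def woven(*args, **kwargs):
            aspect.before(activity.__name__, args, kwargs)
            result = activity(*args, **kwargs)
            aspect.after(activity.__name__, result)
            return result
        return woven
    return decorator

class LoggingEvolutionAspect:
    """Illustrative 'evolution pattern': add logging without touching the activity."""
    def before(self, name, args, kwargs):
        print(f"[aspect] entering {name}")
    def after(self, name, result):
        print(f"[aspect] leaving {name} -> {result!r}")

@weave(LoggingEvolutionAspect())
def invoke_payment_service(order_id):
    return f"payment processed for {order_id}"

invoke_payment_service("order-42")   # advice runs before and after the activity
```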

Relevance:

100.00%

Publisher:

Abstract:

Rapid climatic changes are taking place in Arctic, subarctic and cold temperate regions, where predictions point to an increase in freeze-thaw events and changes in precipitation, evaporation and salinity patterns. Climate change may therefore have large impacts on ecosystem functioning and dynamics, especially in the presence of contaminants from intense anthropogenic activities. Even though multiple-stress approaches have received increasing interest in the last decades, the number of such studies is still limited. In particular, knowledge on the effect of freeze-thaw events and salinity fluctuations on the ecotoxicology of soil invertebrates is lacking, which is especially important when considering supralittoral species. Therefore, the aim of this thesis was to investigate the effects of low temperature and salinity fluctuations, singly and in combination with contaminants, on the freeze-tolerant and euryhaline enchytraeid Enchytraeus albidus. The assessment of population-level endpoints (survival and reproduction), along with physiological and biochemical parameters such as levels of cryoprotectants, ice/water content, oxidative stress biomarkers, cellular energy allocation, and tissue concentration of chemicals (when applied), provided new and valuable knowledge on the effects of selected physical and chemical stressors in E. albidus, and allowed an understanding of the adjustments in the primary response mechanisms that enable worms to maintain homeostasis and survive in harsh environments such as polar and cold temperate regions. The presence of moderate levels of salinity significantly increased the freeze tolerance (mainly evaluated as survival, cryoprotection and ice fraction) and reproduction of E. albidus. Moreover, it contributed to readjustments of cryoprotectant levels and restoration of antioxidant levels, and significantly changed the effect and uptake of chemicals (copper, cadmium, carbendazim and 4-nonylphenol). Temperature fluctuations (simulated as daily freeze-thaw cycles between −2 °C and −4 °C) caused a substantial negative effect on the survival of worms previously exposed to non-lethal concentrations of 4-nonylphenol, as compared with constant freezing (−4 °C) and the control temperature (2 °C). The decrease in cryoprotectants, the increase in energy consumption and the highest tissue concentration of 4-nonylphenol highlighted the high energy requirements and level of toxicity experienced by worms exposed to the combined effect of contaminants and freeze-thaw events. The findings reported in this thesis demonstrate that natural (physical) and chemical stressors, singly or in combination, may alter the dynamics of E. albidus, affecting not only their survival and reproduction (and consequently their presence and distribution) but also their physiological and biochemical adaptations. These alterations may have severe consequences for the functioning of ecosystems across Arctic, subarctic and cold temperate regions, where these worms play an important role in the decomposition of dead organic matter. This thesis provides a scientific basis for improving the setting of safety factors for natural soil ecosystems, and underlines the need to integrate similar investigations into ecotoxicology and, eventually, into the risk assessment of contaminants.

Relevance:

100.00%

Publisher:

Abstract:

Why are some companies more successful than others? This thesis approaches the question by enlisting theoretical frameworks that explain performance through internal factors, deriving from the resource-based view, namely the dynamic capabilities approach. To deepen the understanding of the drivers of and barriers to developing these higher-order routines, which aim at improving operational-level routines, this thesis explores organisational culture and identity research for the microfoundational antecedents that might shed light on the formation of dynamic capabilities. The dynamic capabilities framework in this thesis strives to take the theoretical concept closer to practical applicability. This is achieved through the creation of a dynamic capabilities matrix, consisting of four dimensions often encountered in the dynamic capabilities literature. The quadrants are formed along internal-external and resources-abilities axes, and consist of Sensing, Learning, Reconfiguration and Partnering facets. A key element of this thesis is the reality continuum, which illustrates the different levels of reality inherent in any entity of human individuals. The theoretical framework constructed in the thesis suggests a link between the collective but constructivist understanding of the organisation and both the operational and higher-level routines, evident in the more positivist realm. The findings from three different case organisations suggest that the constructivist assumptions inherent to an organisation function as a generative base for both drivers of and barriers to developing dynamic capabilities. From each organisation one core assumption is scrutinized to identify its connections to the four dimensions of the dynamic capabilities. These connections take the form of drivers or barriers – or have the possibility to develop into one or the other. The main contribution of this thesis is to show that one key for an organisation to perform well in a turbulent setting is to understand the different levels of reality inherent in any group of people. Recognising the intangible levels gives an advantage in the tangible ones.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE We aimed to evaluate the added value of diffusion-weighted imaging (DWI) to standard magnetic resonance imaging (MRI) for detecting post-treatment cervical cancer recurrence. The detection accuracy of T2-weighted (T2W) images was compared with that of T2W MRI combined with either dynamic contrast-enhanced (DCE) MRI or DWI. METHODS Thirty-eight women with clinically suspected uterine cervical cancer recurrence more than six months after treatment completion were examined with 1.5 Tesla MRI including T2W, DCE, and DWI sequences. Disease was confirmed histologically and correlated with MRI findings. The diagnostic performance of T2W imaging and its combination with either DCE or DWI were analyzed. Sensitivity, positive predictive value, and accuracy were calculated. RESULTS Thirty-six women had histologically proven recurrence. The accuracy for recurrence detection was 80% with T2W/DCE MRI and 92.1% with T2W/DWI. The addition of DCE sequences did not significantly improve the diagnostic ability of T2W imaging, and this sequence combination misclassified two patients as falsely positive and seven as falsely negative. The T2W/DWI combination revealed a positive predictive value of 100% and only three false negatives. CONCLUSION The addition of DWI to T2W sequences considerably improved the diagnostic ability of MRI. Our results support the inclusion of DWI in the initial MRI protocol for the detection of cervical cancer recurrence, leaving DCE sequences as an option for uncertain cases.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is to test the ability of several correlative models (the Alpert correlations, published in 1972 and re-examined in 2011; the investigation of Heskestad and Delichatsios in 1978; and the correlations produced by Cooper in 1982) to define both the dynamic and thermal characteristics of a fire-induced ceiling-jet flow. The flow occurs when the fire plume impinges on the ceiling and develops in the radial direction from the fire axis. Temperature and velocity predictions are decisive for sprinkler positioning, fire alarm and detector (heat, smoke) positioning and activation times, and back-layering predictions. These correlative models are compared, in terms of temperature and velocity near the ceiling, with the numerical fire simulation software CFAST and with a Computational Fluid Dynamics (CFD) analysis using ANSYS FLUENT.
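
For reference, the Alpert (1972) ceiling-jet correlations in their commonly quoted form can be evaluated as below; the coefficients and validity limits should be checked against the original source before any design use:

```python
def alpert_ceiling_jet(Q_kw, H_m, r_m, T_ambient_c=20.0):
    """Alpert (1972) ceiling-jet correlations, commonly quoted form.

    Q_kw : total heat release rate [kW]
    H_m  : ceiling height above the fire source [m]
    r_m  : radial distance from the plume axis [m]
    Returns (maximum gas temperature [degC], maximum gas velocity [m/s]).
    """
    ratio = r_m / H_m
    # maximum ceiling-jet temperature rise
    if ratio <= 0.18:                     # turning region near the impingement point
        dT = 16.9 * Q_kw ** (2.0 / 3.0) / H_m ** (5.0 / 3.0)
    else:                                 # radial ceiling-jet region
        dT = 5.38 * (Q_kw / r_m) ** (2.0 / 3.0) / H_m
    # maximum ceiling-jet velocity
    if ratio <= 0.15:
        U = 0.96 * (Q_kw / H_m) ** (1.0 / 3.0)
    else:
        U = 0.195 * Q_kw ** (1.0 / 3.0) * H_m ** 0.5 / r_m ** (5.0 / 6.0)
    return T_ambient_c + dT, U

# example: 1 MW fire under a 5 m ceiling, detector 3 m from the fire axis
print(alpert_ceiling_jet(1000.0, 5.0, 3.0))
```

Near-ceiling temperature and velocity estimates of this kind are the quantities compared against CFAST and FLUENT in the thesis.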

Relevance:

100.00%

Publisher:

Abstract:

Part 19: Knowledge Management in Networks

Relevance:

100.00%

Publisher:

Abstract:

Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust and quantitative observational constraints through Bayesian inference for earthquake source models, and physical insights on the interconnections of seismic and aseismic fault behavior from elastodynamic modeling of earthquake ruptures and aseismic processes.

To constrain the shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches to explore the maximum resolution of the tsunami data, with a focus on incorporating the uncertainty in the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field in the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms and for the coseismic fault slip models in the 2010 Mw 8.8 Maule earthquake with complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we are able to quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge during both events, featuring the peak of uplift near the edge of the accretionary wedge with a decay toward the trench axis, with implications for fault failure and tsunamigenic mechanisms of megathrust earthquakes.
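
As a generic illustration of the Bayesian machinery involved (a plain Metropolis sampler over slip parameters with a Gaussian misfit whose covariance folds in forward-model uncertainty; the semi-analytical and numerical approaches of the thesis are not reproduced here):

```python
import numpy as np

def log_posterior(slip, d_obs, forward, cov):
    """Gaussian log-likelihood whose covariance combines observational noise and
    forward-model uncertainty, with a non-negativity prior on slip."""
    if np.any(slip < 0):
        return -np.inf
    r = d_obs - forward(slip)
    return -0.5 * r @ np.linalg.solve(cov, r)

def metropolis(d_obs, forward, cov, n_params, n_iter=20000, step=0.05, seed=0):
    """Sample the posterior over slip parameters with a random-walk Metropolis chain."""
    rng = np.random.default_rng(seed)
    x = np.full(n_params, 1.0)                 # initial slip model
    lp = log_posterior(x, d_obs, forward, cov)
    samples = []
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal(n_params)
        lp_prop = log_posterior(prop, d_obs, forward, cov)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)                   # posterior samples of fault slip
```

The spread of such posterior samples is what yields the uncertainty estimates on near-trench displacement and slip described above.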

To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes by deeper penetration of large earthquakes below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the modeling to connect the geodetic observables of fault locking with the behavior of seismicity in numerical models, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.
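
The rate-and-state framework referred to here is conventionally written as mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc), with the aging-law state evolution d(theta)/dt = 1 - V*theta/Dc. A minimal numerical sketch of these standard expressions (parameter values are illustrative, not those of the thesis models):

```python
import numpy as np

def rate_and_state_friction(V, theta, mu0=0.6, a=0.010, b=0.015, V0=1e-6, Dc=0.01):
    """Standard rate-and-state friction coefficient (velocity weakening when a < b)."""
    return mu0 + a * np.log(V / V0) + b * np.log(V0 * theta / Dc)

def evolve_state(theta, V, dt, Dc=0.01):
    """Aging-law state evolution d(theta)/dt = 1 - V*theta/Dc (explicit Euler step)."""
    return theta + dt * (1.0 - V * theta / Dc)

# velocity step from V0 to 10*V0: friction first jumps up (direct effect),
# then decays toward a lower steady-state value as theta relaxes toward Dc/V.
V0, Dc = 1e-6, 0.01
V, theta = 10 * V0, Dc / V0
for _ in range(5):
    print(rate_and_state_friction(V, theta))
    theta = evolve_state(theta, V, dt=100.0)
```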

Relevance:

100.00%

Publisher:

Abstract:

Natural radioactive tracer-based assessments of basin-scale submarine groundwater discharge (SGD) are well developed. However, SGD takes place in different modes and the flow and discharge mechanisms involved occur over a wide range of spatial and temporal scales. Quantifying SGD while discriminating its source functions therefore remains a major challenge. However, correctly identifying both the fluid source and composition is critical. When multiple sources of the tracer of interest are present, failure to adequately discriminate between them leads to inaccurate attribution and the resulting uncertainties will affect the reliability of SGD solute loading estimates. This lack of reliability then extends to the closure of local biogeochemical budgets, confusing measures aiming to mitigate pollution. Here, we report a multi-tracer study to identify the sources of SGD, distinguish its component parts and elucidate the mechanisms of their dispersion throughout the Ria Formosa – a seasonally hypersaline lagoon in Portugal. We combine radon budgets that determine the total SGD (meteoric + recirculated seawater) in the system with stable isotopes in water (δ²H, δ¹⁸O) to specifically identify SGD source functions and characterize active hydrological pathways in the catchment. Using this approach, SGD in the Ria Formosa could be separated into two modes, a net meteoric water input and another involving no net water transfer, i.e., originating in lagoon water recirculated through permeable sediments. The former SGD mode is present occasionally on a multi-annual timescale, while the latter is a dominant feature of the system. In the absence of meteoric SGD inputs, seawater recirculation through beach sediments occurs at a rate of ∼1.4 × 10⁶ m³ day⁻¹. This implies that the entire tidal-averaged volume of the lagoon is filtered through local sandy sediments within 100 days (∼3.5 times a year), driving an estimated nitrogen (N) load of ∼350 Ton N yr⁻¹ into the system as NO₃⁻. Land-borne SGD could add a further ∼61 Ton N yr⁻¹ to the lagoon. The former source is autochthonous, continuous and responsible for a large fraction (59 %) of the estimated total N inputs into the system via non-point sources, while the latter is an occasional allochthonous source capable of driving new production in the system.
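
As a quick back-of-the-envelope check on the turnover figures quoted above (the lagoon volume is not stated in the abstract, so it is inferred here from the reported recirculation rate and residence time):

```python
recirculation_rate = 1.4e6        # m^3 of seawater filtered through the sediments per day
residence_time_days = 100         # days to filter the tidal-averaged lagoon volume

implied_lagoon_volume = recirculation_rate * residence_time_days
turnovers_per_year = 365 / residence_time_days

print(f"implied tidal-averaged volume ~ {implied_lagoon_volume:.1e} m^3")  # ~1.4e8 m^3
print(f"turnovers per year ~ {turnovers_per_year:.1f}")                    # ~3.7, vs ~3.5 quoted
```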

Relevance:

100.00%

Publisher:

Abstract:

Traditional decision making research has often focused on one's ability to choose from a set of prefixed options, ignoring the process by which decision makers generate courses of action (i.e., options) in situ (Klein, 1993). In complex and dynamic domains, this option generation process is particularly critical to understanding how successful decisions are made (Zsambok & Klein, 1997). When generating response options for oneself to pursue (i.e., during the intervention phase of decision making), previous research has supported quick and intuitive heuristics, such as the Take-The-First heuristic (TTF; Johnson & Raab, 2003). When generating predictive options for others in the environment (i.e., during the assessment phase of decision making), previous research has supported the situational-model-building process described by Long Term Working Memory theory (LTWM; see Ward, Ericsson, & Williams, 2013). In the first three experiments, the claims of TTF and LTWM are tested during assessment- and intervention-phase tasks in soccer. To test what other environmental constraints may dictate the use of these cognitive mechanisms, the claims of these models are also tested in the presence and absence of time pressure. In addition to understanding the option generation process, it is important that researchers in complex and dynamic domains also develop tools that can be used by 'real-world' professionals. For this reason, three more experiments were conducted to evaluate the effectiveness of a new online assessment of perceptual-cognitive skill in soccer. This test differentiated between skill groups, predicted performance on a previously established test, and predicted option generation behavior. The test also outperformed domain-general cognitive tests, but not a domain-specific knowledge test, when predicting skill group membership. Implications for theory and training, and future directions for the development of applied tools, are discussed.

Relevance:

100.00%

Publisher:

Abstract:

We experimentally study the temporal dynamics of amplitude-modulated laser beams propagating through a water dispersion of graphene oxide sheets in a fiber-to-fiber U-bench. Nonlinear refraction induced in the sample by thermal effects leads to both phase reversing of the transmitted signals and dynamic hysteresis in the input-output power curves. A theoretical model including beam propagation and thermal lensing dynamics reproduces the experimental findings. © 2015 Optical Society of America.
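
A toy model of how a slow thermal nonlinearity can open a loop in the input-output curve under amplitude modulation (a first-order thermal relaxation feeding a saturating transmission; purely illustrative, not the beam-propagation model used in the paper):

```python
import numpy as np

def thermal_hysteresis(t, p_in, tau=0.5, gain=2.0, t50=1.0):
    """Toy dynamic hysteresis: d(dT)/dt = (gain*P_in - dT)/tau drives a
    saturating transmission T = 1/(1 + dT/t50). When the modulation period
    is comparable to tau, the input-output curve opens into a loop."""
    dt = t[1] - t[0]
    dT = 0.0
    p_out = np.empty_like(p_in)
    for i, p in enumerate(p_in):
        dT += dt * (gain * p - dT) / tau        # explicit Euler step for the thermal state
        p_out[i] = p / (1.0 + dT / t50)         # thermally reduced transmission
    return p_out

t = np.linspace(0.0, 5.0, 5000)
p_in = 1.0 + 0.8 * np.sin(2 * np.pi * 1.0 * t)  # amplitude-modulated input power
p_out = thermal_hysteresis(t, p_in)             # plotting p_out vs p_in shows the loop
```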

Relevance:

100.00%

Publisher:

Abstract:

Monitoring user interaction activities provides the basis for creating a user model that can be used to predict user behaviour and enable user assistant services. The BaranC framework provides components that perform UI monitoring (collecting all associated context data), build a user model, and support services that make use of the user model. In this case study, a Next-App prediction service is built to demonstrate the use of the framework and to evaluate the usefulness of such a prediction service. Next-App analyses a user's data, learns patterns, builds a model of the user, and finally predicts, based on the user model and the current context, which application(s) the user is likely to want to use next. The prediction is pro-active and dynamic: it responds both to the current context and to changes in the user model, as might occur over time when a user's habits change. Initial evaluation of Next-App indicates a high level of satisfaction with the service.
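
A minimal sketch in the same spirit as Next-App (a frequency model over (context, previous app) pairs; the class, method names and context keys are illustrative assumptions, not BaranC's actual API):

```python
from collections import defaultdict, Counter

class NextAppPredictor:
    """Predict the next application from the previous app and a coarse context key."""

    def __init__(self):
        # (context, previous_app) -> counts of the apps launched next
        self.transitions = defaultdict(Counter)

    def observe(self, context, previous_app, next_app):
        """Update the model from one monitored UI interaction."""
        self.transitions[(context, previous_app)][next_app] += 1

    def predict(self, context, previous_app, top_n=3):
        """Return the top-n candidate apps for the current situation."""
        counts = self.transitions[(context, previous_app)]
        return [app for app, _ in counts.most_common(top_n)]

model = NextAppPredictor()
model.observe("weekday-morning", "email", "calendar")
model.observe("weekday-morning", "email", "calendar")
model.observe("weekday-morning", "email", "browser")
print(model.predict("weekday-morning", "email"))   # ['calendar', 'browser']
```

Because the model keys on both context and the previous app, its output shifts with the current situation and keeps adapting as new observations arrive, which is the dynamic behaviour described above.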