932 results for Dynamic environments
Abstract:
Cameleons are genetically-encoded fluorescent indicators for Ca2+ based on green fluorescent protein variants and calmodulin (CaM). Because cameleons can be targeted genetically and imaged by one- or two-photon excitation microscopy, they offer great promise for monitoring Ca2+ in whole organisms, tissues, organelles, and submicroscopic environments in which measurements were previously impossible. However, the original cameleons suffered from significant pH interference, and their Ca2+ buffering and cross-reactivity with endogenous CaM signaling pathways were uncharacterized. We have now greatly reduced the pH sensitivity of the cameleons by introducing mutations V68L and Q69K into the acceptor yellow fluorescent protein. The resulting new cameleons permit Ca2+ measurements despite significant cytosolic acidification. When Ca2+ is elevated, the CaM and CaM-binding peptide fused together in a cameleon predominantly interact with each other rather than with free CaM and CaM-dependent enzymes. Therefore, if cameleons are overexpressed, the primary effect is likely to be the unavoidable increase in Ca2+ buffering rather than specific perturbation of CaM-dependent signaling.
Abstract:
Cybercrime and related malicious activity in our increasingly digital world has become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are interesting specimens, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not aided by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments.
Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed in order to further automate the architecture by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning on multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
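The multi-TCP-connection acquisition paradigm the abstract reports can be illustrated with a minimal sketch: the evidence image is split into byte ranges and each range is pulled over its own TCP connection in parallel. Everything here is an assumption for illustration — the toy range-request protocol (`"offset,length"`), the in-memory `EVIDENCE` image, and the function names are hypothetical and not taken from the LEIA architecture.

```python
import socket
import threading
from concurrent.futures import ThreadPoolExecutor

EVIDENCE = bytes(range(256)) * 64  # 16 KiB stand-in for a device image

def serve() -> socket.socket:
    """Toy evidence server: each connection requests one byte range."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 0))
    srv.listen()

    def handle(conn: socket.socket) -> None:
        with conn:
            off, length = map(int, conn.recv(64).decode().split(","))
            conn.sendall(EVIDENCE[off:off + length])

    def loop() -> None:
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

    threading.Thread(target=loop, daemon=True).start()
    return srv

def fetch_chunk(port: int, off: int, length: int) -> tuple[int, bytes]:
    """One TCP connection pulls one chunk of the evidence image."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(f"{off},{length}".encode())
        buf = b""
        while len(buf) < length:
            part = s.recv(4096)
            if not part:
                break
            buf += part
    return off, buf

def acquire(port: int, total: int, n_conns: int = 4) -> bytes:
    """Split the image into ranges and fetch them over parallel connections."""
    chunk = -(-total // n_conns)  # ceiling division
    jobs = [(off, min(chunk, total - off)) for off in range(0, total, chunk)]
    image = bytearray(total)
    with ThreadPoolExecutor(max_workers=n_conns) as pool:
        for off, data in pool.map(lambda j: fetch_chunk(port, *j), jobs):
            image[off:off + len(data)] = data
    return bytes(image)
```

The design point the experiments make is visible here: because each range travels on its own connection, a stalled or dropped connection costs one chunk rather than the whole acquisition, and the chunks can be retried or load-balanced independently.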
Abstract:
Virtual learning environments (VLEs) are computer-based online learning environments, which provide opportunities for online learners to learn at the time and location of their choosing, whilst allowing interactions and encounters with other online learners, as well as affording access to a wide range of resources. They have the capability of reaching learners in remote areas around the country or across country boundaries at very low cost. Personalized VLEs are those VLEs that provide a set of personalization functionalities, such as personalizing learning plans, learning materials and tests, and are capable of initializing the interaction with learners by providing advice, necessary instant messages, etc., to online learners. One of the major challenges involved in developing personalized VLEs is to achieve effective personalization functionalities, such as personalized content management, learner model, learner plan and adaptive instant interaction. Autonomous intelligent agents provide an important technology for accomplishing personalization in VLEs. A number of agents work collaboratively to enable personalization by recognizing an individual's eLearning pace and reacting correspondingly. In this research, firstly, a personalization model has been developed that demonstrates dynamic eLearning processes; secondly, an architecture for PVLE is proposed that uses intelligent decision-making agents' autonomous, reactive and proactive behaviors. A prototype system has been developed to demonstrate the implementation of this architecture. Furthermore, a field experiment has been conducted to investigate the performance of the prototype by comparing PVLE eLearning effectiveness with a non-personalized VLE. Data regarding participants' final exam scores were collected and analyzed. The results indicate that intelligent agent technology can be employed to achieve personalization in VLEs, and as a consequence to improve eLearning effectiveness dramatically.
Abstract:
This work deals with the random free vibration of functionally graded laminates with general boundary conditions and subjected to a temperature change, taking into account the randomness in a number of independent input variables such as Young's modulus, Poisson's ratio and thermal expansion coefficient of each constituent material. Based on third-order shear deformation theory, the mixed-type formulation and a semi-analytical approach are employed to derive the standard eigenvalue problem in terms of deflection, mid-plane rotations and stress function. A mean-centered first-order perturbation technique is adopted to obtain the second-order statistics of vibration frequencies. A detailed parametric study is conducted, and extensive numerical results are presented in both tabular and graphical forms for laminated plates that contain functionally graded material which is made of aluminum and zirconia, showing the effects of scattering in thermo-elastic material constants, temperature change, edge support condition, side-to-thickness ratio, and plate aspect ratio on the stochastic characteristics of natural frequencies. (c) 2005 Elsevier B.V. All rights reserved.
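The mean-centered first-order perturbation step can be sketched on a toy problem: solve the eigenproblem at the mean of the random input, then propagate the input variance through the first-order eigenvalue sensitivity. The 2-DOF matrices, the assumption that stiffness scales linearly with Young's modulus, and all numerical values below are illustrative stand-ins, not the plate formulation of the paper.

```python
import numpy as np

# Toy 2-DOF stand-in for the discretized eigenproblem K(E) phi = lam M phi,
# with stiffness K scaling linearly in the random Young's modulus E.
K0 = np.array([[2.0, -1.0],
               [-1.0, 2.0]])   # stiffness matrix per unit modulus (assumed)
M = np.diag([1.0, 1.5])        # mass matrix (assumed)

E_mean = 70e9                  # mean Young's modulus in Pa (assumed)
cov_E = 0.10                   # coefficient of variation of E (assumed)

# Mean-centered eigensolution: reduce to a standard symmetric problem.
Mi = np.diag(1.0 / np.sqrt(np.diag(M)))
lam, psi = np.linalg.eigh(Mi @ (E_mean * K0) @ Mi)
phi = Mi @ psi                 # mass-normalized modes: phi^T M phi = I

# First-order sensitivity of each eigenvalue to E:
#   dlam_i/dE = phi_i^T (dK/dE) phi_i = phi_i^T K0 phi_i
dlam_dE = np.array([phi[:, i] @ K0 @ phi[:, i] for i in range(len(lam))])

# Second-order statistics of eigenvalues and natural frequencies
sigma_E = cov_E * E_mean
lam_std = np.abs(dlam_dE) * sigma_E
freq_mean = np.sqrt(lam) / (2.0 * np.pi)
freq_std = lam_std / (2.0 * np.sqrt(lam) * 2.0 * np.pi)  # delta method on f = sqrt(lam)/(2*pi)
```

Because K here is exactly linear in E, the first-order result is exact and the eigenvalues inherit E's coefficient of variation; in the paper's thermo-elastic setting the dependence is only approximately linear, which is precisely what the first-order truncation assumes.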
Abstract:
Business environments have become exceedingly dynamic and competitive in recent times. This dynamism is manifested in the form of changing process requirements and time constraints. Workflow technology is currently one of the most promising fields of research in business process automation. However, workflow systems to date do not provide the flexibility necessary to support the dynamic nature of business processes. In this paper we primarily discuss the issues and challenges related to managing change and time in workflows representing dynamic business processes. We also present an analysis of workflow modifications and provide feasibility considerations for the automation of this process.
Abstract:
PURPOSE. The purpose of this study was to evaluate the potential of the portable Grand Seiko FR-5000 autorefractor to allow objective, continuous, open-field measurement of accommodation and pupil size for the investigation of the visual response to real-world environments and changes in the optical components of the eye. METHODS. The FR-5000 projects a pair of infrared horizontal and vertical lines on either side of fixation, analyzing the separation of the bars in the reflected image. The measurement bars were turned on permanently and the video output of the FR-5000 fed into a PC for real-time analysis. The calibration between infrared bar separation and the refractive error was assessed over a range of 10.0 D with a model eye. Tolerance to longitudinal instrument head shift was investigated over a ±15 mm range and to eye alignment away from the visual axis over eccentricities up to 25.0°. The minimum pupil size for measurement was determined with a model eye. RESULTS. The separation of the measurement bars changed linearly (r = 0.99), allowing continuous online analysis of the refractive state at 60 Hz temporal and approximately 0.01 D system resolution with pupils >2 mm. The pupil edge could be analyzed on the diagonal axes at the same rate with a system resolution of approximately 0.05 mm. The measurements of accommodation and pupil size were affected by eccentricity of viewing and instrument focusing inaccuracies. CONCLUSIONS. The small size of the instrument together with its resolution and temporal properties and ability to measure through a 2 mm pupil make it useful for the measurement of dynamic accommodation and pupil responses in confined environments, although good eye alignment is important. Copyright © 2006 American Academy of Optometry.
Abstract:
The aerobic selective oxidation (selox) of alcohols represents an environmentally benign and atom-efficient chemical valorisation route to commercially important allylic aldehydes, such as crotonaldehyde and cinnamaldehyde, which find application in pesticides, fragrances and food additives. Palladium nanoparticles are highly active and selective heterogeneous catalysts for such oxidative dehydrogenations, permitting the use of air (or dioxygen) as a green oxidant in place of stoichiometric chromate or permanganate salts, or H2O2. Here we discuss how time-resolved, in-situ X-ray spectroscopies (XAS and XPS) reveal dynamic restructuring of dispersed Pd nanoparticles and Pd single-crystals in response to changing reaction environments, and thereby identify surface PdO as the active species responsible for palladium-catalysed crotyl alcohol selox (Figure 1); on-stream reduction to palladium metal under oxygen-poor regimes thus appears the primary cause of catalyst deactivation. This insight has guided the subsequent application of surfactant-templating and inorganic nanocrystal methodologies to optimize the density of desired active PdO sites for the selective oxidation of natural products such as sesquiterpenoids.
Abstract:
Different types of ontologies and the knowledge or metaknowledge connected to them are considered and analyzed, aiming at realization in contemporary information security systems (ISS) and especially in intrusion detection systems (IDS) or intrusion prevention systems (IPS). The human-centered methods INCONSISTENCY, FUNNEL, CALEIDOSCOPE and CROSSWORD are algorithmic or data-driven methods based on ontologies. All of them interact on the competitive principle of 'survival of the fittest', and they are controlled by a Synthetic MetaMethod SMM. It is shown that data analysis frequently needs an act of creation, especially when applied to knowledge-poor environments. It is also shown that human-centered methods are very suitable for resolving such cases, and that they are often based on the use of dynamic ontologies.
Abstract:
Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate a lane (or lanes) adjacent to a freeway that provides congestion-free trips to eligible users, such as transit or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among different approaches for predicting this demand, the four-step demand forecasting process is most common. Managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the utilization of effective assignment modeling processes.

Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in static traffic assignment of demand forecasting models, fail to correctly predict the managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support an effective utilization of DTA to model managed lane operations.

Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in different stages of modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as proper definition of performance measures, results in a calibrated and stable model, which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
Abstract:
The exploration and development of oil and gas reserves located in harsh offshore environments are characterized by high risk. Some of these reserves would be uneconomical if produced using conventional drilling technology, due to increased drilling problems and prolonged non-productive time. Seeking new ways to reduce drilling cost and minimize risks has led to the development of Managed Pressure Drilling techniques. Managed pressure drilling methods address the drawbacks of conventional overbalanced and underbalanced drilling techniques. As managed pressure drilling techniques are evolving, there are many unanswered questions related to safety and operating pressure regimes. Quantitative risk assessment techniques are often used to answer these questions. Quantitative risk assessment is conducted for the various stages of drilling operations - drilling ahead, tripping operation, casing and cementing. A diagnostic model for analyzing the rotating control device, the main component of managed pressure drilling techniques, is also studied. The logic concept of Noisy-OR is explored to capture the unique relationship between casing and cementing operations in leading to well integrity failure, as well as to model the critical components of the constant bottom-hole pressure drilling technique of managed pressure drilling during tripping operation. Relevant safety functions and inherent safety principles are utilized to improve well integrity operations. A loss function modelling approach that enables dynamic consequence analysis is adopted to study blowout risk for real-time decision making. The aggregation of the blowout loss categories, comprising production, asset, human health, environmental response and reputation losses, leads to risk estimation using a dynamically determined probability of occurrence.
Lastly, various sub-models developed for the stages/sub-operations of drilling operations and the consequence modelling approach are integrated for a holistic risk analysis of drilling operations.
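The Noisy-OR combination referred to above can be sketched in a few lines: each active parent cause (for example a casing defect or a poor cement bond) independently produces the failure with its own link probability, and a leak term covers unmodelled causes. The link probabilities below are illustrative, not field-derived values from the study.

```python
def noisy_or(link_probs, active, leak=0.0):
    """Noisy-OR gate: each active cause i independently produces the failure
    with probability link_probs[i]; `leak` covers unmodelled causes."""
    p_no_failure = 1.0 - leak
    for p, is_active in zip(link_probs, active):
        if is_active:
            p_no_failure *= (1.0 - p)
    return 1.0 - p_no_failure

# Illustrative link probabilities for two causes of well integrity failure:
# a defective casing connection (0.30) and a poor cement bond (0.20).
p_fail = noisy_or([0.30, 0.20], active=[True, True])  # 1 - 0.7 * 0.8 = 0.44
```

The appeal of Noisy-OR for the casing/cementing relationship is that it needs only one probability per cause rather than a full conditional probability table over all cause combinations, which grows exponentially with the number of parents.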
Abstract:
The study of the Upper Jurassic-Lower Cretaceous deposits (Higueruelas, Villar del Arzobispo and Aldea de Cortés Formations) of the South Iberian Basin (NW Valencia, Spain) reveals new stratigraphic and sedimentological data, which have significant implications on the stratigraphic framework, depositional environments and age of these units. The Higueruelas Fm was deposited in a mid-inner carbonate platform where oncolitic bars migrated by the action of storms and where oncoid production progressively decreased towards the uppermost part of the unit. The overlying Villar del Arzobispo Fm has been traditionally interpreted as an inner platform-lagoon evolving into a tidal-flat. Here it is interpreted as an inner-carbonate platform affected by storms, where oolitic shoals protected a lagoon, which had siliciclastic inputs from the continent. The Aldea de Cortés Fm has been previously interpreted as a lagoon surrounded by tidal-flats and fluvial-deltaic plains. Here it is reinterpreted as a coastal wetland where siliciclastic muddy deposits interacted with shallow fresh to marine water bodies, aeolian dunes and continental siliciclastic inputs. The contact between the Higueruelas and Villar del Arzobispo Fms, classically defined as gradual, is also interpreted here as rapid. More importantly, the contact between the Villar del Arzobispo and Aldea de Cortés Fms, previously considered as unconformable, is here interpreted as gradual. The presence of Alveosepta in the Villar del Arzobispo Fm suggests that at least part of this unit is Kimmeridgian, unlike the previously assigned Late Tithonian-Middle Berriasian age. Consequently, the underlying Higueruelas Fm, previously considered Tithonian, should not be younger than Kimmeridgian. Accordingly, sedimentation of the Aldea de Cortés Fm, previously considered Valanginian-Hauterivian, probably started during the Tithonian and it may be considered part of the regressive trend of the Late Jurassic-Early Cretaceous cycle.
This is consistent with the dinosaur faunas, typically Jurassic, described in the Villar del Arzobispo and Aldea de Cortés Fms.
Abstract:
In this study, the authors propose simple methods to evaluate the achievable rates and outage probability of a cognitive radio (CR) link that take into account imperfect spectrum sensing. In the considered system, the CR transmitter and receiver correlatively sense and dynamically exploit the spectrum pool via dynamic frequency hopping. Under imperfect spectrum sensing, false alarms and missed detections occur, which cause impulsive interference arising from collisions due to the simultaneous spectrum access of primary and cognitive users. This makes it very challenging to evaluate the achievable rates. By first examining the static link, where the channel is assumed to be constant over time, they show that the achievable rate using a Gaussian input can be calculated accurately through a simple series representation. In the second part of this study, they extend the calculation of the achievable rate to wireless fading environments. To take into account the effect of fading, they introduce a piece-wise linear curve-fitting-based method to approximate the instantaneous achievable rate curve as a combination of linear segments. It is then demonstrated that the ergodic achievable rate in fast fading and the outage probability in slow fading can be calculated to any given accuracy level.
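The piece-wise linear idea can be sketched numerically: approximate the instantaneous rate curve by linear segments between knots, then average over the fading distribution. This sketch uses the standard Gaussian-input rate log2(1 + SNR) without the impulsive-interference term from the paper; the SNR range of the fit, the knot count, and the mean SNR are all assumed values, and the averaging is done by Monte Carlo rather than the paper's closed-form segment integrals.

```python
import numpy as np

def rate(snr):
    """Instantaneous achievable rate (bits/s/Hz) for a Gaussian input."""
    return np.log2(1.0 + snr)

# Piece-wise linear fit of the rate curve on an assumed SNR range [0, 40].
knots = np.linspace(0.0, 40.0, 64)
vals = rate(knots)

def rate_pwl(snr):
    """Approximate the instantaneous rate curve by linear segments."""
    return np.interp(np.clip(snr, knots[0], knots[-1]), knots, vals)

# Ergodic rate in fast Rayleigh fading: instantaneous SNR is exponential.
rng = np.random.default_rng(0)
snr_samples = rng.exponential(8.0, 200_000)   # mean SNR = 8 (~9 dB), assumed
ergodic_exact = rate(snr_samples).mean()
ergodic_pwl = rate_pwl(snr_samples).mean()
```

Because log2(1 + x) is concave, each chord lies below the curve, so the piece-wise linear estimate is a lower bound that tightens as more segments are used — which is why an arbitrary accuracy level can be reached by refining the segmentation.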
Abstract:
Hydrothermal sulfide chimneys located along the global system of oceanic spreading centers are habitats for microbial life during active venting. Hydrothermally extinct, or inactive, sulfide deposits also host microbial communities at globally distributed sites. The main goal of this study is to describe Fe transformation pathways, through precipitation and oxidation-reduction (redox) reactions, and examine transformation products for signatures of biological activity using Fe mineralogy and stable isotope approaches. The study includes active and inactive sulfides from the East Pacific Rise 9 degrees 50'N vent field. First, the mineralogy of Fe(III)-bearing precipitates is investigated using microprobe X-ray absorption spectroscopy (μXAS) and micro X-ray diffraction (μXRD). Second, laser-ablation (LA) and micro-drilling (MD) are used to obtain spatially-resolved Fe stable isotope analysis by multicollector-inductively coupled plasma-mass spectrometry (MC-ICP-MS). Eight Fe-bearing minerals representing three mineralogical classes are present in the samples: oxyhydroxides, secondary phyllosilicates, and sulfides. For Fe oxyhydroxides within chimney walls and layers of Si-rich material, enrichments in both heavy and light Fe isotopes relative to pyrite are observed, yielding a range of δ57Fe values up to 6 parts per thousand. Overall, several pathways for Fe transformation are observed. Pathway 1 is characterized by precipitation of primary sulfide minerals from Fe(II)aq-rich fluids in zones of mixing between vent fluids and seawater. Pathway 2 is also consistent with zones of mixing but involves precipitation of sulfide minerals from Fe(II)aq generated by Fe(III) reduction. Pathway 3 is direct oxidation of Fe(II)aq from hydrothermal fluids to form Fe(III) precipitates. Finally, Pathway 4 involves oxidative alteration of pre-existing sulfide minerals to form Fe(III).
The Fe mineralogy and isotope data do not support or refute a unique biological role in sulfide alteration. The findings reveal a dynamic range of Fe transformation pathways consistent with a continuum of micro-environments having variable redox conditions. These micro-environments likely support redox cycling of Fe and S and are consistent with culture-dependent and -independent assessments of microbial physiology and genetic diversity of hydrothermal sulfide deposits.
Abstract:
Aim: Rather than being rigid, habitual behaviours may be determined by dynamic mental representations that can adapt to context changes. This adaptive potential may result from particular conditions dependent on the interaction between two sources of mental constructs activation: perceived context applicability and cognitive accessibility. Method: Two web-shopping simulations offering the choice between habitually chosen and non-habitually chosen food products were presented to participants. This considered two choice contexts differing in the habitual behaviour perceived applicability (low vs. high) and a measure of habitual behaviour chronicity. Results: Study 1 demonstrated a perceived applicability effect, with more habitual (non-organic) than non-habitual (organic) food products chosen in a high perceived applicability (familiar) than in a low perceived applicability (new) context. The adaptive potential of habitual behaviour was evident in the habitual products choice consistency across three successive choices, despite the decrease in perceived applicability. Study 2 evidenced the adaptive potential in strong habitual behaviour participants - high chronic accessibility - who chose a habitual product (milk) more than a non-habitual product (orange juice), even when perceived applicability was reduced (new context). Conclusion: Results portray consumers as adaptive decision makers that can flexibly cope with changes in their (inner and outer) choice contexts.
Abstract:
Part 17: Risk Analysis