827 results for SOA-based model
Abstract:
Biophysical and biochemical systems are often based on nanoscale phenomena in different host environments; because they involve many particles, they can often not be solved explicitly. Instead, a physicist, biologist or chemist has to rely on approximate or numerical methods. For a certain type of system, called integrable, there exist particular mathematical structures and symmetries which facilitate an exact and explicit description. Most integrable systems we come across are low-dimensional: for instance, a one-dimensional chain of coupled atoms in a DNA molecular system with a particular direction, or one that exists as a vector in its environment. This theoretical research paper aims at presenting one of the pioneering 'Reaction-Diffusion' aspects of the DNA-plasma material system, based on an integrable lattice-model approach utilizing quantized functional algebras, in order to disseminate the new developments and initiate novel computational and design paradigms.
Abstract:
We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate the abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. A simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
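The thinning construction described above can be illustrated with a toy simulation: points are generated from a homogeneous spatial Poisson process, and each is retained with a probability given by a detection function of its distance to a transect line. This is a minimal sketch assuming a half-normal detection function (a standard choice in distance sampling); the region, transect position, and parameter values are invented for illustration and are not from the paper's Dubbo analysis.

```python
import math
import random

random.seed(1)

def poisson_sample(lam):
    """Knuth's algorithm for a Poisson draw (adequate for moderate lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def simulate_thinned_points(intensity, region=(0.0, 100.0, 0.0, 100.0),
                            transect_x=50.0, sigma=5.0):
    """Simulate a homogeneous spatial Poisson process on a rectangle and
    thin it by a half-normal detection function g(d) = exp(-d^2 / (2 sigma^2))
    of perpendicular distance d to a single vertical transect line."""
    x0, x1, y0, y1 = region
    area = (x1 - x0) * (y1 - y0)
    n = poisson_sample(intensity * area)           # total points in region
    points = [(random.uniform(x0, x1), random.uniform(y0, y1))
              for _ in range(n)]
    detected = []
    for x, y in points:
        d = abs(x - transect_x)                    # perpendicular distance
        g = math.exp(-d * d / (2.0 * sigma * sigma))  # detection probability
        if random.random() < g:
            detected.append((x, y))
    return points, detected
```

Fitting the model would then maximize the thinned-process likelihood jointly over the intensity and detection parameters; the sketch covers only the data-generating side.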
Abstract:
Objective: To determine current food handling practices, knowledge and beliefs of primary food handlers with children 10 years old and the relationship between these components. Design: Surveys were developed based on FightBac!™ concepts and the Health Belief Model (HBM) construct. Participants: The majority of participants (n=503) were female (67%), Caucasian (80%), aged 30 to 49 years (83%), had one or two children (83%), prepared meals all or most of the time (76%) and consumed meals away from home three times or less per week (66%). Analysis: Descriptive and inferential statistics using Spearman's rank correlation coefficient (rho) (p<0.05, one-tailed) and Chi-square were used to examine frequencies and correlations. Results: Few participants reached the food safety objectives of Healthy People 2010 for safe food handling practices (79%). Mixed results were reported for perceived susceptibility. Only about half of the participants (53-54%) reported high perceived severity for their children if they contracted foodborne illness. Most participants were confident of their food handling practices for their children (91%) and would change their food handling practices if they or their family members had previously experienced food poisoning (79%). Participants' reasons for high self-efficacy were learning from their family and independently acquiring knowledge and skills from the media, the internet or their job. The three main barriers to safe food handling were insufficient time, many distractions and lack of control over the food handling practices of other people in the household. Participants preferred food safety information that is easy to understand, contains scientific facts, conveys a sense of health threat and has many pictures or visuals. Participants demonstrated high levels of knowledge in certain areas of the FightBac!™ concepts but lacked knowledge in other areas.
Knowledge and cues to action were most supportive of the HBM construct, while perceived susceptibility was least supportive. Conclusion: Most participants demonstrated many areas for improvement in their food handling practices, knowledge and beliefs. Adviser: Julie A. Albrecht
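The analysis above relies on Spearman's rank correlation coefficient (rho), which is the Pearson correlation computed on the ranks of the data, with tied values receiving average ranks. A minimal pure-Python sketch of that statistic follows; the example values in the test are invented, not the survey's data.

```python
def rank(values):
    """Average ranks (1-based); tied values get the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of values tied with values[order[i]]
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0       # mean of the tied 1-based positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because rho works on ranks, any monotone relationship between the two variables yields |rho| = 1, which is why it suits ordinal survey scores.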
Abstract:
This paper proposes a drain current model for triple-gate n-type junctionless nanowire transistors. The model is based on the solution of the Poisson equation. First, the 2-D Poisson equation is used to obtain the effective surface potential for long-channel devices, which is used to calculate the charge density along the channel and the drain current. The solution of the 3-D Laplace equation is added to the 2-D model in order to account for the short-channel effects. The proposed model is validated using 3-D TCAD simulations, in which the drain current and its derivatives, the potential, and the charge density have been compared, showing good agreement for all parameters. Experimental data from short-channel devices down to 30 nm at different temperatures have also been used to validate the model.
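The model above rests on solving the Poisson equation for the potential. As a generic illustration of that step only (not the paper's 2-D/3-D formulation), here is a 1-D finite-difference Poisson solver with Dirichlet boundary conditions, using the Thomas tridiagonal algorithm; the charge density and permittivity passed in are placeholders.

```python
def solve_poisson_1d(rho, eps, L, n):
    """Finite-difference solution of -eps * phi'' = rho(x) on [0, L]
    with phi(0) = phi(L) = 0, for n interior grid points, via the
    Thomas (tridiagonal) algorithm."""
    h = L / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    # tridiagonal system: eps * (-phi[i-1] + 2*phi[i] - phi[i+1]) / h^2 = rho(x_i)
    a = [-1.0] * n                         # sub-diagonal
    b = [2.0] * n                          # main diagonal
    c = [-1.0] * n                         # super-diagonal
    d = [rho(xi) * h * h / eps for xi in x]
    # forward elimination
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # back substitution
    phi = [0.0] * n
    phi[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - c[i] * phi[i + 1]) / b[i]
    return x, phi
```

For a constant charge density the analytic solution is phi(x) = rho * x * (L - x) / (2 * eps), which the second-order scheme reproduces exactly, making it a convenient correctness check.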
Abstract:
This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to this kind of problem is the inclusion of a non-linear cost constraint in the control problem. The control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem that, for high-order systems, can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with conventional robust MPC and tested through the simulation of a reactor system from the process industry.
Abstract:
In this paper, we discuss the effects of catalyst load with respect to carbon powder for several Pt- and Pb-based catalysts, using formic acid as a model molecule. The discussion is based on electrochemical tests, a complete morphological investigation and theoretical calculations. We show that the Pt- and Pb-based catalysts presented activity in formic acid oxidation at very low catalyst loads (e.g., 0.5% with respect to the carbon content). Physical characterisations demonstrate that the electrodes are composed of separated phases of Pt and Pb: Pt distributed in nanometric-sized islands heterogeneously dispersed on the carbon support, and ultra-small Pb particles homogeneously distributed throughout the entire carbon surface, as demonstrated by the microscopy studies. At high catalyst loads, very large clusters of Pb(x)O(y) could be observed. Electrochemical tests indicated an increase in the apparent resistance of the system (by 19.7 Ω) when the catalyst load was increased. The effect of lead in the materials was also studied by theoretical calculations (DFT). The main conclusion is that the presence of Pb atoms in the catalyst can improve the adsorption of formic acid in the catalytic system compared with a pure Pt-based catalyst. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Several pharmacological targets have been proposed as modulators of panic-like reactions. However, interest should be given to other potential therapeutic neurochemical agents. Recent attention has been given to the potential anxiolytic properties of cannabidiol, because of its complex actions on the endocannabinoid system together with its effects on other neurotransmitter systems. The aim of this study was to investigate the effects of cannabidiol on innate fear-related behaviors evoked by a prey vs predator paradigm. Male Swiss mice were submitted to habituation in an arena containing a burrow and subsequently pre-treated with intraperitoneal administrations of vehicle or cannabidiol. A constrictor snake was placed inside the arena, and defensive and non-defensive behaviors were recorded. Cannabidiol caused a clear anti-aversive effect, decreasing explosive escape and defensive immobility behaviors outside and inside the burrow. These results show that cannabidiol modulates defensive behaviors evoked by the presence of threatening stimuli, even in a potentially safe environment following a fear response, suggesting a panicolytic effect. Neuropsychopharmacology (2012) 37, 412-421; doi:10.1038/npp.2011.188; published online 14 September 2011
Abstract:
This paper provides additional validation to the problem of estimating wave spectra based on the first-order motions of a moored vessel. Prior investigations conducted by the authors have attested that even a large-volume ship, such as an FPSO unit, could be adopted for on-board estimation of the wave field. The obvious limitation of the methodology concerns filtering of high-frequency wave components, for which the vessel has no significant response. As a result, the estimation range is directly dependent on the characteristics of the vessel response. In order to extend this analysis, further small-scale tests were performed with a model of a pipe-laying crane-barge. When compared to the FPSO case, the results attest that a broader range of typical sea states can be accurately estimated, including crossed-sea states with low peak periods. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
A neural network model to predict ozone concentration in the Sao Paulo Metropolitan Area was developed, based on average values of meteorological variables in the morning (8:00-12:00 hr) and afternoon (13:00-17:00 hr) periods. The outputs are the maximum and average ozone concentrations in the afternoon (12:00-17:00 hr). The correlation coefficients between computed and measured values were 0.82 and 0.88 for the maximum and average ozone concentrations, respectively. The model presented good performance as a prediction tool for the maximum ozone concentration. For prediction periods of 1 to 5 days, failure rates of 0 to 23% (95% confidence) were obtained.
Abstract:
Abstract Background Over recent years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed considering the abstractions provided by this new paradigm. We call this type of framework Crosscutting Frameworks (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method In order to solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be automatically generated. Results We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions.
The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the applications. Conclusion By using the approach presented herein, it was possible to conclude the following: (i) it is possible to automate the instantiation of CFs, and (ii) the productivity of developers is improved when they use a model-based instantiation approach.
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach has the power to detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
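An event sequence graph encodes which event may follow which; test sequences are walks from an entry event to an exit event, and tests for "undesirable situations" can be derived from event pairs the graph forbids. A minimal sketch under that reading follows; the graph and event names are invented, and the paper's actual tooling is not reproduced.

```python
def event_sequences(graph, start, end, max_len=10):
    """Enumerate event sequences (paths) from start to end in an event
    sequence graph given as {event: [allowed successor events]}."""
    seqs = []
    def walk(node, path):
        if len(path) > max_len:          # bound sequence length (graph may cycle)
            return
        if node == end:
            seqs.append(path)
            return
        for nxt in graph.get(node, []):
            walk(nxt, path + [nxt])
    walk(start, [start])
    return seqs

def faulty_pairs(graph, events):
    """Faulty event pairs (e, f) where f may NOT follow e: the complement
    of the graph's edges, used to exercise undesirable situations."""
    return [(e, f) for e in events for f in events
            if f not in graph.get(e, [])]
```

A usage example with a toy composition: `{'login': ['search'], 'search': ['search', 'checkout'], 'checkout': []}` yields the regular sequence `login → search → checkout` (plus longer ones with repeated searches), while `faulty_pairs` produces forbidden continuations such as `('login', 'checkout')`.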
Abstract:
Work carried out by Antonio Machado Carrillo, Juan Antonio Bermejo and Ignacio Lorenzo
Abstract:
This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original one, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language, which basically adopts an XHTML syntax, able to capture, a posteriori, only the content of a digital document. It is compared with other languages and proposals, in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of users' advantages, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), from e-learning (IsaLearning) to professional printing (IsaPress).
Abstract:
We study some perturbative and nonperturbative effects in the framework of the Standard Model of particle physics. In particular, we consider the time dependence of the Higgs vacuum expectation value given by the dynamics of the Standard Model and study the non-adiabatic production of both bosons and fermions, which is intrinsically non-perturbative. In the Hartree approximation, we analyze the general expressions that describe the dissipative dynamics due to the backreaction of the produced particles. Then, we solve numerically some relevant cases for Standard Model phenomenology in the regime of relatively small oscillations of the Higgs vacuum expectation value (vev). As perturbative effects, we consider the leading logarithmic resummation in small Bjorken-x QCD, concentrating on the Nc dependence of the Green functions associated with reggeized gluons. Here the eigenvalues of the BKP kernel for states of more than three reggeized gluons are unknown in general, contrary to the large-Nc limit (planar limit) case, where the problem becomes integrable. In this context we consider a 4-gluon kernel for a finite number of colors and define some simple toy models for the configuration-space dynamics, which are directly solvable with group-theoretical methods. In particular, we study the dependence of the spectrum of these models with respect to the number of colors and make comparisons with the planar limit case. In the final part we move on to the study of theories beyond the Standard Model, considering models built on AdS5 × S5/Γ orbifold compactifications of the type IIB superstring, where Γ is the abelian group Zn. We present an appealing three-family N = 0 SUSY model with n = 7 for the order of the orbifolding group. This results in a modified Pati–Salam model which reduces to the Standard Model after symmetry breaking and has interesting phenomenological consequences for the LHC.
Abstract:
Ensemble forecasting is a methodology to deal with uncertainties in numerical wind prediction. In this work we propose to apply ensemble methods to the adaptive wind forecasting model presented in. The wind field forecasting is based on a mass-consistent model and a log-linear wind profile, using as input data the forecast wind resulting from Harmonie, a non-hydrostatic dynamic model used experimentally at AEMET with promising results. The mass-consistent model parameters are estimated by using genetic algorithms. The mesh is generated using the meccano method and adapted to the geometry…
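The abstract states that the mass-consistent model parameters are estimated with genetic algorithms. As a hedged illustration of that idea only, here is a tiny real-coded genetic algorithm minimizing a one-dimensional objective; the operators, constants, and objective are invented and are unrelated to the actual wind model.

```python
import random

random.seed(7)

def genetic_minimize(f, bounds, pop=30, gens=60, mut=0.2):
    """Tiny real-coded genetic algorithm: tournament selection, blend
    (averaging) crossover, Gaussian mutation, clamped to the bounds.
    Returns the best individual of the final population."""
    lo, hi = bounds
    P = [random.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            p1 = min(random.sample(P, 2), key=f)   # tournament of size 2
            p2 = min(random.sample(P, 2), key=f)
            child = 0.5 * (p1 + p2)                # blend crossover
            if random.random() < mut:              # Gaussian mutation
                child += random.gauss(0.0, 0.1 * (hi - lo))
            nxt.append(min(max(child, lo), hi))    # clamp to bounds
        P = nxt
    return min(P, key=f)
```

In the parameter-estimation setting, `f` would be the misfit between the mass-consistent model's output and observed winds, and each individual would encode the model parameters.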