981 results for "original model"
Abstract:
Some properties of the higher-grading integrable generalizations of the conformal affine Toda systems are studied. The fields associated with the non-zero grade generators are Dirac spinors. The effective action is written in terms of the Wess-Zumino-Novikov-Witten (WZNW) action associated with an affine Lie algebra, and an off-critical theory is obtained as the result of the spontaneous breakdown of the conformal symmetry. Moreover, the off-critical theory presents a remarkable equivalence between the Noether and topological currents of the model. For the off-critical model we define a real and local Lagrangian, provided some reality conditions are imposed on the fields of the model. This real-action model is expected to describe the soliton sector of the original model, and turns out to be the master action from which we uncover the weak- and strong-coupling phases described by (generalized) massive Thirring and sine-Gordon type models, respectively. The case of an arbitrary (untwisted) affine Lie algebra furnished with the principal gradation is studied in some detail. The example of ŝl(n) (n = 2, 3) is presented explicitly. © SISSA/ISAS 2003.
Abstract:
Today, the trend within the electronics industry is toward the use of rapid and advanced simulation methodologies in association with synthesis toolsets. This paper presents an approach developed to support mixed-signal circuit design and analysis. The proposed methodology takes a novel approach to developing behavioural model descriptions of mixed-signal circuit topologies: by constructing a set of subsystems, it supports the automated mapping of MATLAB®/SIMULINK® models to structural VHDL-AMS descriptions. The tool developed, named MS2SV, reads a SIMULINK® model file and translates it into structural VHDL-AMS code. It also creates the file structure required to simulate the translated model in System Vision™. To validate the methodology and the developed program, the DAC08, AD7524 and AD5450 data converters were studied and initially modelled in MATLAB®/SIMULINK®. The VHDL-AMS code generated automatically by MS2SV (MATLAB®/SIMULINK® to System Vision™) was then simulated in System Vision™. The simulation results show that the proposed approach, which is based on VHDL-AMS descriptions of the original model library elements, allows for behavioural-level simulation of complex mixed-signal circuits.
Improvement and evaluation of MS2SV for the design of mixed-signal systems described at a high level of abstraction
Abstract:
This paper presents an important improvement to the MS2SV tool. MS2SV translates mixed systems developed in MATLAB/Simulink into structural or behavioural descriptions in VHDL-AMS. Previously, MS2SV translated only models from the LIB MS2SV library; the improvement allows designers to create their own libraries for translation. As a case study, a rudder controller employed in an unmanned aerial vehicle was used. For comparison with the original model, the VHDL-AMS code obtained by the translation was simulated in the SystemVision environment. The results demonstrate the effectiveness of the tool when using the translation improvement proposed in this paper.
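The abstracts above do not describe MS2SV's internal file formats or its parser, so the following is only an illustrative sketch of the general idea of such a translator: mapping a parsed block description to a structural VHDL-AMS entity/architecture skeleton. Every identifier below (the function name, the component and port names) is hypothetical.

```python
# Illustrative sketch of a Simulink-block-list -> structural VHDL-AMS
# translation step, in the spirit of MS2SV. All names are hypothetical;
# the real tool's parser and library format are not described here.

def emit_vhdl_ams(model_name, blocks):
    # `blocks` maps an instance name to (library component, port map).
    lines = [
        f"entity {model_name} is",
        "end entity;",
        f"architecture structural of {model_name} is",
        "begin",
    ]
    for inst, (comp, ports) in blocks.items():
        pm = ", ".join(f"{p} => {s}" for p, s in ports.items())
        lines.append(f"  {inst}: entity work.{comp} port map ({pm});")
    lines.append("end architecture;")
    return "\n".join(lines)

# One hypothetical DAC block wired to a data bus and an analog output net.
blocks = {"u1": ("dac8_core", {"din": "data_bus", "vout": "v_analog"})}
print(emit_vhdl_ams("rudder_ctrl", blocks))
```

A real translator would also emit the terminal/quantity declarations and the support files the simulation environment expects; the sketch only shows the structural netlist mapping.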
Abstract:
The aim of this study was to assess the validity and reliability of the Fonseca Anamnestic Index (IAF), used to assess the severity of temporomandibular disorders, applied to Brazilian women. We used a probabilistic sampling design. The participants were 700 women over 18 years of age living in the city of Araraquara (SP). The IAF questionnaire was administered by telephone interview. We conducted Confirmatory Factor Analysis (CFA) using chi-square over degrees of freedom (χ2/df), the Comparative Fit Index (CFI), the Tucker-Lewis Index (TLI), and the Root Mean Square Error of Approximation (RMSEA) as goodness-of-fit indices. We assessed convergent validity through the average variance extracted (AVE) and the composite reliability (CR). Internal consistency was assessed by Cronbach's alpha coefficient (α). The factorial weights of questions 8 and 10 were below adequate values, so we refined the original model and excluded these questions. The resulting factorial model showed appropriate goodness of fit to the sample (χ2/df = 3.319, CFI = 0.978, TLI = 0.967, RMSEA = 0.058). The convergent validity (AVE = 0.513, CR = 0.878) and internal consistency (α = 0.745) were adequate. The reduced IAF version showed adequate validity and reliability in a sample of Brazilian women.
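The AVE and CR figures quoted above are standard functions of the standardized factor loadings of a one-factor CFA model. A minimal sketch of both formulas follows; the loadings used are invented for illustration, not the study's actual estimates.

```python
# Average variance extracted (AVE) and composite reliability (CR)
# from standardized factor loadings of a one-factor CFA model.
# The loadings below are illustrative, not the study's estimates.

def ave(loadings):
    # AVE: mean of the squared standardized loadings.
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    # CR: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    # variances), with error variance 1 - l^2 for each loading l.
    s = sum(loadings)
    errors = sum(1 - l * l for l in loadings)
    return s * s / (s * s + errors)

loadings = [0.62, 0.71, 0.68, 0.74, 0.79, 0.66, 0.73, 0.70]
print(round(ave(loadings), 3))                    # ~0.5 marks adequacy
print(round(composite_reliability(loadings), 3))  # >= 0.7 marks adequacy
```

The usual adequacy thresholds (AVE ≥ 0.5, CR ≥ 0.7) are the ones the abstract's reported values are being judged against.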
Abstract:
This paper aims to explore the relationships between supply chain strategies and product performance in retail e-commerce. Currently, in order to remain competitive, organizations must manage their supply chains so that they meet the needs of their final customers. With this concept in mind, the research presented in this study focuses on establishing the right strategy for supply chains according to their product segment. Thus, after a literature review, the paper explains a methodology based on studies by different authors. Finally, the article focuses on a practical case in e-commerce retail that describes the methodology's application in this field. The research shows that it is possible to classify supply chains using chain strategies and product features. Using the right strategy for supply chains will improve the competitive advantage of businesses. One limitation is that the study focuses on only two e-commerce segments; future studies may go further in refining the proposed framework for other segments. The aim of this research is to offer businesses a model for evaluating supply chains, allowing them to improve the performance of their products and services by using the right supply chain strategy. The classification proposal of this paper presents an original model for classifying supply chains based on different studies on the theme.
Abstract:
The maintenance of biodiversity is a long-standing puzzle in ecology. It is a classical result that if the interactions of the species in an ecosystem are chosen at random, complex ecosystems cannot sustain themselves, meaning that the structure of the interactions between species must be a central component in the preservation of biodiversity and the stability of ecosystems. The rock-paper-scissors model is one of the paradigmatic models of how biodiversity is maintained. In this model, three species dominate each other cyclically (mimicking a trophic cycle): rock dominates scissors, which dominates paper, which dominates rock. In the original version of the model, this dominance obeys a Z3 symmetry, in the sense that the strength of dominance is always the same. In this work, we break this symmetry, studying the effects of adding an asymmetry parameter. In the usual model, on a two-dimensional lattice, the species distribute themselves in spiral patterns that can be explained by the complex Ginzburg-Landau equation. With the addition of asymmetry, new spatial patterns appear during the transient, and the system ends either in a state with spirals similar to those of the original model, in a state where unstable spatial patterns dominate, or in a state where only one species survives (and biodiversity is lost).
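The cyclic-dominance dynamics described above can be sketched as a minimal stochastic lattice model. The update rule and the way the asymmetry parameter enters are assumptions for illustration (the abstract does not specify them): species 0 beats 1, 1 beats 2, 2 beats 0, and `eps > 0` strengthens only the 0 → 1 link, breaking the Z3 symmetry; `eps = 0` recovers the symmetric model.

```python
import random

# Minimal stochastic rock-paper-scissors on a 2-D lattice (sketch).
# The specific update rule and asymmetry mechanism are illustrative
# assumptions, not the paper's exact model.

def beats(a, b):
    # Cyclic dominance: 0 beats 1, 1 beats 2, 2 beats 0.
    return (b - a) % 3 == 1

def step(grid, n, eps=0.0):
    # Random sequential update: a site tries to invade one neighbour
    # (periodic boundary conditions).
    i, j = random.randrange(n), random.randrange(n)
    neighbours = [((i + 1) % n, j), ((i - 1) % n, j),
                  (i, (j + 1) % n), (i, (j - 1) % n)]
    ni, nj = random.choice(neighbours)
    a, b = grid[i][j], grid[ni][nj]
    if beats(a, b):
        # Species 0 invades with probability 1, the other two links with
        # probability 1/(1+eps): eps > 0 breaks the Z3 symmetry.
        p = 1.0 if a == 0 else 1.0 / (1.0 + eps)
        if random.random() < p:
            grid[ni][nj] = a

random.seed(0)
n = 32
grid = [[random.randrange(3) for _ in range(n)] for _ in range(n)]
for _ in range(50000):
    step(grid, n, eps=0.3)
counts = [sum(row.count(s) for row in grid) for s in range(3)]
print(counts)  # species abundances after the transient
```

On larger lattices and longer runs, tracking `counts` over time is how one distinguishes the coexistence (spiral) regime from the regime where one species fixates and biodiversity is lost.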
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if analyzed properly and in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, and energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to provisioning differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers, and we present Quasit, its prototype implementation, which offers a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing through a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
Abstract:
Chemotherapeutic SN1-methylating agents are important anticancer drugs. They induce several covalent modifications in DNA, of which O6-methylguanine (O6MeG) is the main toxic lesion. In this work, different hypotheses that have been proposed to explain the mechanism of O6MeG-triggered cell death were tested. The results support the abortive processing model, which states that abortive post-replicative processing of O6MeG-driven mispairs by the DNA mismatch repair (MMR) machinery results in single-strand gaps in the DNA that, upon a second round of DNA replication, lead to DNA double-strand break (DSB) formation, checkpoint activation and cell death. It was shown that O6MeG induces an accumulation of cells in the 2nd G2/M phase after treatment. This was accompanied by an increase in DSB formation in the 2nd S/G2/M phase and paralleled by activation of the checkpoint kinases ATR and CHK1. Apoptosis was activated in the 2nd cell cycle. A portion of cells continues proliferating past the 2nd cell cycle and triggers apoptosis in the subsequent generations. An extension to the original model is proposed, in which the persistence of O6MeG in the DNA causes new abortive MMR processing in the 2nd and subsequent generations, producing new DSB that trigger cell death. Interestingly, removal of O6MeG beyond the 2nd generation led to a significant, but not complete, reduction in apoptosis, pointing to the involvement of additional mechanisms as a cause of apoptosis. We therefore propose that an increase in genomic instability resulting from the accumulation of mis-repaired DNA damage plays a role in cell death induction.
Given the central role of DSB formation in the toxicity triggered by chemotherapeutic SN1-alkylating agents, the second part of this thesis aimed to determine whether inhibition of DSB repair by homologous recombination (HR) or non-homologous end joining (NHEJ) is a reasonable strategy for sensitizing glioblastoma cells to these agents. The results show that HR down-regulation in glioblastoma cells impairs the repair of temozolomide (TMZ)-induced DSB. HR down-regulation greatly sensitizes cells to cell death following O6-methylating (TMZ) or O6-chloroethylating (nimustine) treatment, but not following ionizing radiation. The RNAi-mediated inhibition of DSB repair and the resulting chemo-sensitization were proportional to the knockdown of the HR protein RAD51. Chemo-sensitization was demonstrated for several HR proteins, in glioma cell lines both proficient and mutated in p53. Evidence is provided that O6MeG is the primary lesion responsible for the increased sensitivity of glioblastoma cells following TMZ treatment, and that inhibition of the resistance marker MGMT restores the chemo-sensitization achieved by HR down-regulation. Data are also provided showing that inhibition of DNA-PK-dependent NHEJ does not significantly sensitize glioblastoma cells to TMZ treatment. Finally, the data show that PARP inhibition with olaparib additionally sensitized HR down-regulated glioma cells to TMZ. Collectively, the data show that processing of O6MeG through two rounds of DNA replication is required for DSB formation, checkpoint activation and apoptosis induction, and that O6MeG-triggered apoptosis is also executed in subsequent generations. Furthermore, the data provide proof-of-principle evidence that down-regulation of HR is a reasonable strategy for sensitizing glioma cells to killing by O6-alkylating chemotherapeutics.
Abstract:
BACKGROUND: Faculties face the permanent challenge of designing training programs with well-balanced educational outcomes and of offering various organised and individual learning opportunities. AIM: To apply our original model to a postgraduate training program in rheumatology in general, and to various learning experiences in particular, in order to analyse the balance between different educational objectives. METHODS: Learning times of various educational activities were reported by the junior staff as the targeted learners. The suitability of different learning experiences for achieving cognitive, affective and psychomotor learning objectives was estimated. Learning points with respect to efficacy were calculated by multiplying the estimated learning times by the perceived appropriateness of the educational strategies. RESULTS: Of 780 hours of professional learning per year (17.7 hours/week), 37.7% of the time was spent under individual supervision of senior staff, 24.4% in organised structured learning, 22.6% in self-study, and 15.3% in organised patient-oriented learning. The balance between the different types of learning objectives was appropriate for the overall program, but not for each particular learning experience. Acquisition of factual knowledge and problem solving was readily aimed for during organised teaching sessions of different formats and by personal targeted reading. Attitudes, skills and competencies, as well as behavioural and performance changes, were mostly learned while caring for patients under interactive supervision by experts. CONCLUSION: We encourage other faculties to apply this approach to any other curriculum of undergraduate education, postgraduate training or continuing professional development in order to foster the development of well-balanced learning experiences.
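The learning-points calculation described in METHODS is a simple weighted product. The hours below are derived from the reported percentages of the 780-hour year; the appropriateness scores are invented placeholders, since the abstract does not report the actual ratings.

```python
# Learning points = estimated learning time x perceived appropriateness
# of the educational strategy. Hours follow the reported percentages of
# a 780-hour year; the 0..1 appropriateness scores are invented.

activities = {
    # activity: (hours per year, appropriateness score 0..1)
    "individual supervision":         (780 * 0.377, 0.9),
    "organised structured learning":  (780 * 0.244, 0.7),
    "self-study":                     (780 * 0.226, 0.6),
    "patient-oriented learning":      (780 * 0.153, 0.8),
}

points = {name: hours * score for name, (hours, score) in activities.items()}
for name, p in sorted(points.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.1f} learning points")
```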
Abstract:
The theory of the intensities of 4f-4f transitions introduced by B.R. Judd and G.S. Ofelt in 1962 has become a centerpiece of rare-earth optical spectroscopy over the past five decades. Many fundamental studies have since explored the physical origins of the Judd–Ofelt theory and have proposed numerous extensions to the original model. A great number of studies have applied the Judd–Ofelt theory to a wide range of rare-earth-doped materials, many of them with important applications in solid-state lasers, optical amplifiers, phosphors for displays and solid-state lighting, upconversion and quantum-cutting materials, and fluorescent markers. This paper takes the view of the experimentalist who is interested in appreciating the basic concepts, implications, assumptions, and limitations of the Judd–Ofelt theory in order to apply it properly to practical problems. We first present the formalism for calculating the wavefunctions of 4f electronic states in concise form and then show its application to the calculation and fitting of 4f-4f transition intensities. The potential, limitations and pitfalls of the theory are discussed, and a detailed case study of LaCl3:Er3+ is presented.
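The fitting step mentioned above is, at its core, a linear least-squares problem: in the standard Judd–Ofelt treatment each measured line strength is modelled as S = Ω2·U2 + Ω4·U4 + Ω6·U6, where the Uλ are squared reduced matrix elements and the three Ωλ are the fit parameters. The sketch below solves that problem via the normal equations in pure Python; all numerical values are invented for illustration and are not data for LaCl3:Er3+.

```python
# Sketch of a Judd-Ofelt intensity fit: measured line strengths are
# modelled as S = Omega_2*U2 + Omega_4*U4 + Omega_6*U6 and the three
# Omega parameters are found by linear least squares. All numbers are
# illustrative, not real spectroscopic data.

def lstsq_3(A, y):
    # Solve min ||A x - y|| for 3 parameters via the normal equations
    # (A^T A) x = A^T y, using Gaussian elimination with pivoting.
    m = len(A)
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(3)]
           for i in range(3)]
    Aty = [sum(A[k][i] * y[k] for k in range(m)) for i in range(3)]
    M = [AtA[i] + [Aty[i]] for i in range(3)]
    for c in range(3):                      # forward elimination
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, 3):
            f = M[r][c] / M[c][c]
            M[r] = [M[r][k] - f * M[c][k] for k in range(4)]
    x = [0.0] * 3
    for c in (2, 1, 0):                     # back substitution
        x[c] = (M[c][3] - sum(M[c][k] * x[k]
                              for k in range(c + 1, 3))) / M[c][c]
    return x

# Hypothetical (U2, U4, U6) rows and measured strengths for four bands.
U = [[0.02, 0.11, 0.46],
     [0.00, 0.53, 0.46],
     [0.71, 0.41, 0.09],
     [0.26, 0.12, 0.01]]
S = [1.30, 2.10, 2.60, 0.70]
omega = lstsq_3(U, S)
print([round(w, 2) for w in omega])  # fitted Omega_2, Omega_4, Omega_6
```

In practice the fit uses many more transitions than parameters, and the quality of the residuals is one of the diagnostics for the theory's limitations discussed in the paper.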
Abstract:
The goal of this paper is to revisit the influential work of Mauro [1995], focusing on the strength of his results under weak identification. He finds a negative impact of corruption on investment and economic growth that appears to be robust to endogeneity when using two-stage least squares (2SLS). Since the publication of Mauro [1995], much of the literature on 2SLS methods has revealed the dangers of estimation, and thus inference, under weak identification. We reproduce the original results of Mauro [1995] with a high level of confidence and show that the instrument used in the original work is in fact 'weak' as defined by Staiger and Stock [1997]. We therefore update the analysis using a test statistic that is robust to weak instruments. Our results suggest that under Mauro's original model there is a high probability that the parameters of interest are locally almost unidentified in multivariate specifications. To address this problem, we also investigate other instruments commonly used in the corruption literature and obtain similar results.
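The Staiger-Stock weak-instrument diagnostic referenced above amounts, in the single-instrument case, to checking the first-stage F statistic (the squared t statistic of the instrument in the first-stage regression) against a rule-of-thumb threshold of 10. A minimal sketch on simulated data (not Mauro's), assuming one instrument and one endogenous regressor:

```python
import random

# First-stage F diagnostic for a single instrument: regress the
# endogenous regressor x on the instrument z and compute F = t^2.
# The data below are simulated for illustration, not Mauro's data.

def first_stage_f(z, x):
    n = len(z)
    mz, mx = sum(z) / n, sum(x) / n
    sxz = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x))
    szz = sum((zi - mz) ** 2 for zi in z)
    b = sxz / szz                          # first-stage slope
    a = mx - b * mz
    rss = sum((xi - a - b * zi) ** 2 for zi, xi in zip(z, x))
    var_b = rss / (n - 2) / szz            # variance of the slope
    return b * b / var_b                   # F = t^2 with one instrument

random.seed(1)
z = [random.gauss(0, 1) for _ in range(500)]
# A deliberately weak instrument: x barely depends on z.
x = [0.05 * zi + random.gauss(0, 1) for zi in z]
F = first_stage_f(z, x)
print(round(F, 2))  # compare against the rule-of-thumb threshold of 10
```

When F falls below the threshold, conventional 2SLS confidence intervals are unreliable, which is what motivates the weak-instrument-robust test statistics used in the paper.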
Abstract:
In this comment, we chronicle the development and expansion of a postabortion care model designed to promote interventions that address abortion-related public health concerns even when abortion laws and policies are restrictive. We review years of program experience with the original model, which led to the development of an expanded and updated model, Essential Elements of Postabortion Care (PAC). Implementing the model challenges global public health leaders, donors, technical assistance agencies and ministries of health to work with communities to ensure that all women who want to prevent or space pregnancies can obtain contraceptive services; that all women have access to services to manage complications from abortion, whether induced or spontaneous; and that all women receiving treatment also receive counseling and the reproductive and other health services they need at the treatment visit, as well as follow-up care and contraceptive resupply.