866 results for Cognitive Linguistics. Situation Models. Mental Simulation. Frames and Schemes
Abstract:
In linear mixed models, model selection frequently includes the selection of random effects. Two versions of the Akaike information criterion (AIC) have been used, based either on the marginal or on the conditional distribution. We show that the marginal AIC is no longer an asymptotically unbiased estimator of the Akaike information, and in fact favours smaller models without random effects. For the conditional AIC, we show that ignoring estimation uncertainty in the random effects covariance matrix, as is common practice, induces a bias that leads to the selection of any random effect not predicted to be exactly zero. We derive an analytic representation of a corrected version of the conditional AIC, which avoids the high computational cost and imprecision of available numerical approximations. An implementation in an R package is provided. All theoretical results are illustrated in simulation studies, and their impact in practice is investigated in an analysis of childhood malnutrition in Zambia.
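As a rough numerical sketch of the marginal AIC discussed above, the snippet below builds a toy random-intercept data set, integrates the random effects out into the marginal covariance, and compares the criterion with and without the random effect. All data and parameter values are invented, the variance parameters are fixed rather than estimated, and this does not implement the paper's corrected conditional AIC:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical random-intercept data: 10 groups, 5 observations per group.
n_groups, n_per = 10, 5
group = np.repeat(np.arange(n_groups), n_per)
x = rng.normal(size=n_groups * n_per)
b = rng.normal(scale=1.0, size=n_groups)             # true random intercepts
y = 2.0 + 1.5 * x + b[group] + rng.normal(scale=0.5, size=x.size)

X = np.column_stack([np.ones_like(x), x])            # fixed-effects design
Z = np.eye(n_groups)[group]                          # random-intercept design

def marginal_aic(y, X, Z, tau2, sigma2):
    """-2 * marginal log-likelihood + 2 * (number of fixed effects + variance parameters)."""
    V = tau2 * Z @ Z.T + sigma2 * np.eye(len(y))     # marginal covariance of y
    Vinv_X = np.linalg.solve(V, X)
    beta = np.linalg.solve(X.T @ Vinv_X, Vinv_X.T @ y)   # GLS fixed-effects estimate
    ll = multivariate_normal(mean=X @ beta, cov=V).logpdf(y)
    k = X.shape[1] + (2 if tau2 > 0 else 1)          # count tau2 only when it is in the model
    return -2 * ll + 2 * k

aic_with = marginal_aic(y, X, Z, tau2=1.0, sigma2=0.25)    # random intercept retained
aic_without = marginal_aic(y, X, Z, tau2=0.0, sigma2=1.25) # random effect dropped
```

With strong within-group correlation, the model retaining the random intercept attains the smaller criterion here; the paper's point is the subtler asymptotic bias of this criterion when the true variance lies near the boundary.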
Abstract:
To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States (U.S.) is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as “biomass”). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization and to tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass based on a set of evaluation criteria, such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions simultaneously. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.
Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of producing biofuel. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help to ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level. Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass. This is because the biomass feedstock supply chain is similar, if not the same, for biorefineries, biomass-fired or co-fired power plants, or torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals, such as Biomass and Bioenergy and Renewable Energy, and presentations at conferences, such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
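The weighted-sum, multi-criteria objective described above can be illustrated with a toy enumeration: open a subset of candidate facility sites and let each harvesting area ship to its best open site. All numbers and weights below are invented for illustration; the actual proposal uses GIS screening followed by full optimization and simulation models over a 20-year horizon.

```python
from itertools import combinations

# Hypothetical per-unit metrics from harvesting area h to candidate site s
# (rows = sites, columns = harvesting areas; values are illustrative only).
cost   = [[4, 7, 6], [5, 3, 8], [9, 4, 2]]
energy = [[2, 3, 3], [2, 1, 4], [5, 2, 1]]
ghg    = [[1, 2, 2], [1, 1, 3], [3, 1, 1]]
weights = (0.5, 0.3, 0.2)   # relative importance of cost, energy, GHG

def objective(open_sites):
    """Assign each harvesting area to its cheapest open site under the weighted sum."""
    total = 0.0
    for h in range(3):
        total += min(weights[0] * cost[s][h] + weights[1] * energy[s][h]
                     + weights[2] * ghg[s][h] for s in open_sites)
    return total

# Enumerate every way of opening exactly two of the three candidate sites.
best = min(combinations(range(3), 2), key=objective)
```

For realistic instance sizes this enumeration is replaced by mixed-integer optimization, but the weighted-sum structure of the objective is the same.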
Abstract:
This technical report discusses the application of the Lattice Boltzmann Method (LBM) and Cellular Automata (CA) simulation to fluid flow and particle deposition. The current work focuses on simulating incompressible flow past cylinders, in which we incorporate the LBM D2Q9 and CA techniques to simulate the fluid flow and particle loading, respectively. For the LBM part, the theories of boundary conditions are studied and verified using the Poiseuille flow test. For the CA part, several models for simulating particles are explained, and a new Digital Differential Analyzer (DDA) algorithm is introduced to simulate particle motion in the Boolean model. The numerical results are compared with those of a previous probability velocity model by Masselot [Masselot 2000] and show satisfactory agreement.
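As a rough sketch of how a DDA-style traversal maps a straight particle path onto lattice cells, the function below steps along the dominant axis in unit increments and rounds the minor axis. This is a generic DDA assuming integer lattice endpoints, not necessarily the exact algorithm of the report:

```python
def dda_cells(x0, y0, x1, y1):
    """Lattice cells visited by a straight particle path from (x0, y0) to (x1, y1).

    Assumes integer endpoints. The number of steps equals the larger coordinate
    displacement, so per-step increments never exceed one cell.
    """
    dx, dy = x1 - x0, y1 - y0
    steps = max(abs(dx), abs(dy))
    if steps == 0:
        return [(round(x0), round(y0))]
    sx, sy = dx / steps, dy / steps       # per-step increments, magnitude <= 1
    cells = []
    x, y = float(x0), float(y0)
    for _ in range(steps + 1):
        cells.append((round(x), round(y)))  # snap the continuous path to a cell
        x += sx
        y += sy
    return cells
```

In a Boolean CA particle model, a routine of this kind decides which cells a moving particle sweeps through before deposition rules are applied.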
Abstract:
In many complex and dynamic domains, the ability to generate and then select the appropriate course of action is based on the decision maker's "reading" of the situation; in other words, their ability to assess the situation and predict how it will evolve over the next few seconds. Current theories regarding option generation during the situation assessment and response phases of decision making offer contrasting views on the cognitive mechanisms that support superior performance. The Recognition-Primed Decision-making model (RPD; Klein, 1989) and Take-The-First heuristic (TTF; Johnson & Raab, 2003) suggest that superior decisions are made by generating few options, and then selecting the first option as the final one. Long-Term Working Memory theory (LTWM; Ericsson & Kintsch, 1995), on the other hand, posits that skilled decision makers construct rich, detailed situation models, and that as a result, skilled performers should have the ability to generate more of the available task-relevant options. The main goal of this dissertation was to use these theories about option generation as a way to further the understanding of how police officers anticipate a perpetrator's actions, and make decisions about how to respond, during dynamic law enforcement situations. An additional goal was to gather information that can be used, in the future, to design training based on the anticipation skills, decision strategies, and processes of experienced officers. Two studies were conducted to achieve these goals. Study 1 identified video-based law enforcement scenarios that could be used to discriminate between experienced and less-experienced police officers, in terms of their ability to anticipate the outcome. The discriminating scenarios were used as the stimuli in Study 2; 23 experienced and 26 less-experienced police officers observed temporally-occluded versions of the scenarios, and then completed assessment and response option-generation tasks.
The results provided mixed support for the nature of option generation in these situations. Consistent with RPD and TTF, participants typically selected the first-generated option as their final one, and did so during both the assessment and response phases of decision making. Consistent with LTWM theory, participants, regardless of experience level, generated more task-relevant assessment options than task-irrelevant options. However, an expected interaction between experience level and option-relevance was not observed. Collectively, the two studies provide a deeper understanding of how police officers make decisions in dynamic situations. The methods developed and employed in the studies can be used to investigate anticipation and decision making in other critical domains (e.g., nursing, military). The results are discussed in relation to how they can inform future studies of option-generation performance, and how they could be applied to develop training for law enforcement officers.
Abstract:
Psychological models of mental disorders guide research into psychological and environmental factors that elicit and maintain mental disorders as well as interventions to reduce them. This paper addresses four areas. (1) Psychological models of mental disorders have become increasingly transdiagnostic, focusing on core cognitive endophenotypes of psychopathology from an integrative cognitive psychology perspective rather than offering explanations for unitary mental disorders. It is argued that psychological interventions for mental disorders will increasingly target specific cognitive dysfunctions rather than symptom-based mental disorders as a result. (2) Psychotherapy research still lacks a comprehensive conceptual framework that brings together the wide variety of findings, models and perspectives. Analysing the state of the art in psychotherapy treatment research, “component analyses” aiming at an optimal identification of core ingredients and the mechanisms of change are highlighted as the core need towards improved efficacy and effectiveness of psychotherapy, and improved translation to routine care. (3) In order to provide more effective psychological interventions to children and adolescents, there is a need to develop new and/or improved psychotherapeutic interventions on the basis of developmental psychopathology research taking into account knowledge of mediators and moderators. Developmental neuroscience research might be instrumental to uncover associated aberrant brain processes in children and adolescents with mental health problems and to better examine mechanisms of their correction by means of psychotherapy and psychological interventions. (4) Psychotherapy research needs to broaden in terms of adoption of large-scale public health strategies and treatments that can be applied to more patients in a simpler and cost-effective way. Increased research on efficacy and moderators of Internet-based treatments and e-mental health tools (e.g.
to support “real time” clinical decision-making to prevent treatment failure or relapse) might be one promising way forward.
Abstract:
Objective: To determine how a clinician’s background knowledge, their tasks, and displays of information interact to affect the clinician’s mental model. Design: Repeated-measures nested experimental design. Population, Sample, Setting: The populations were gastrointestinal/internal medicine physicians and nurses within the greater Houston area. A purposeful sample of 24 physicians and 24 nurses was studied in 2003. Methods: Subjects were randomized to two different displays of two different mock medical records; one that contained highlighted patient information and one that contained non-highlighted patient information. They were asked to read and summarize their understanding of the patients aloud. Propositional analysis was used to understand their comprehension of the patients. Findings: Different mental models were found between physicians and nurses given the same display of information. The information they shared was minor compared to the variance in their mental models. There was also more variance within the nursing mental models than the physician mental models given different displays of the same information. Statistically, there was no interaction effect between the display of information and clinician type. Only clinician type could account for the differences in clinician comprehension and thus their mental models of the cases. Conclusion: The factors that may explain the variance within and between the clinician models are clinician type and, in the nursing group only, the use of highlighting.
Abstract:
The body schema is a key component in accomplishing egocentric mental transformations, which rely on bodily reference frames. These reference frames are based on a plurality of different cognitive and sensory cues among which the vestibular system plays a prominent role. We investigated whether a bottom-up influence of vestibular stimulation modulates the ability to perform egocentric mental transformations. Participants were significantly faster to make correct spatial judgments during vestibular stimulation as compared to sham stimulation. Interestingly, no such effects were found for mental transformation of hand stimuli or during mental transformations of letters, thus showing a selective influence of vestibular stimulation on the rotation of whole-body reference frames. Furthermore, we found an interaction between the angle of rotation and vestibular stimulation, demonstrating an increase in facilitation during mental body rotations in a direction congruent with rightward vestibular afferents. We propose that facilitation reflects a convergence in shared brain areas that process bottom-up vestibular signals and top-down imagined whole-body rotations, including the precuneus and temporo-parietal junction. Ultimately, our results show that vestibular information can influence higher-order cognitive processes, such as the body schema and mental imagery.
Abstract:
A growing body of evidence suggests a link between early childhood trauma, post-traumatic stress disorder (PTSD) and higher risk for dementia in old age. The aim of the present study was to investigate the association between childhood trauma exposure, PTSD and neurocognitive function in a unique cohort of former indentured Swiss child laborers in their late adulthood. To the best of our knowledge this is the first study ever conducted on former indentured child laborers and the first to investigate the relationship between childhood versus adulthood trauma and cognitive function. According to PTSD symptoms and whether they experienced childhood trauma (CT) or adulthood trauma (AT), participants (n = 96) were categorized as belonging to one of four groups: CT/PTSD+, CT/PTSD-, AT/PTSD+, AT/PTSD-. Information on cognitive function was assessed using the Structured Interview for Diagnosis of Dementia of Alzheimer Type, Multi-infarct Dementia and Dementia of other Etiology according to ICD-10 and DSM-III-R, the Mini-Mental State Examination, and a vocabulary test. Depressive symptoms were investigated as a potential mediator for neurocognitive functioning. Individuals screening positively for PTSD symptoms performed worse on all cognitive tasks compared to healthy individuals, independent of whether they reported childhood or adulthood adversity. When controlling for depressive symptoms, the relationship between PTSD symptoms and poor cognitive function became stronger. Overall, results tentatively indicate that PTSD is accompanied by cognitive deficits which appear to be independent of earlier childhood adversity. Our findings suggest that cognitive deficits in old age may be partly a consequence of PTSD or at least be aggravated by it. However, several study limitations need to be considered. Consideration of cognitive deficits when treating PTSD patients and victims of lifespan trauma (even without a diagnosis of a psychiatric condition) is crucial.
Furthermore, early intervention may prevent long-term deficits in memory function and development of dementia in adulthood.
Abstract:
N. Bostrom’s simulation argument and two additional assumptions imply that we are likely to live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and material models. Even though the computer hardware does provide a material model of the target, this does not suffice to underwrite the simulation argument because the ways in which parts of the computer hardware interact during simulations do not resemble the ways in which neurons interact in the brain. Further, there are computer simulations of all kinds of systems, and it would be unreasonable to infer that some computers display consciousness just because they simulate brains rather than, say, galaxies.
Abstract:
This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting consistency in both products. Systematic biases between both data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show stronger response to changes in the external forcings than recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded for the winters during the first decades of the 18th century in the reconstructions is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that reconstructions are too simplistic, especially for precipitation, which is associated with the linear statistical techniques used to generate the reconstructions. 
The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result which resembles the canonical co-variability recorded in the observations for the 20th century. However, the same analysis for reconstructions exhibits anomalously low correlations, which points towards a lack of dynamical consistency between independent reconstructions.
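The comparison of leading modes of variability mentioned above is typically done with an EOF (principal component) analysis of the anomaly fields. The sketch below builds two synthetic time x space fields that share one dominant spatial mode (all data are invented stand-ins for the simulated and reconstructed SAT fields) and compares their leading EOFs by pattern correlation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic anomaly fields: rows = seasons (time), columns = grid points.
t = rng.normal(size=200)                       # amplitude time series of the mode
pattern = np.sin(np.linspace(0, np.pi, 50))    # imposed dominant spatial pattern
sim = np.outer(t, pattern) + 0.3 * rng.normal(size=(200, 50))   # "simulation"
rec = sim + 0.5 * rng.normal(size=(200, 50))                    # "reconstruction"

def leading_mode(field):
    """First EOF (spatial pattern) of a time x space anomaly field, via SVD."""
    anomalies = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    return vt[0]                               # leading right singular vector

e_sim, e_rec = leading_mode(sim), leading_mode(rec)
# Pattern correlation; absolute value because the sign of an EOF is arbitrary.
pattern_corr = abs(np.corrcoef(e_sim, e_rec)[0, 1])
```

In the study's setting, a low pattern correlation or an overly simple reconstructed mode structure is what signals the statistical limitations of the reconstruction techniques.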
Abstract:
My dissertation focuses on developing methods for gene-gene/environment interaction and imprinting effect detection for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the Natural and Orthogonal InterAction (NOIA) model, a coding technique originally developed for gene-gene (GxG) interaction, to reduced models; (2) developing a novel statistical approach that allows for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has now become a hot topic in the effort to search for the complex network between multiple genes or environmental exposures contributing to the outcome. Epistasis, also known as gene-gene interaction, is the departure from additivity of the genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects under different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies along with interaction for human complex traits and diseases. We compare the performance of the new statistical models we developed and the usual functional model by both simulation study and real data analysis. Both simulation and real data analysis revealed higher power of the NOIA GxG interaction model for detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3 and 9p21.3. We also identified potential interactions with these significant regions that contribute to melanoma risk.
Based on the NOIA model, we developed a novel statistical approach that allows us to model effects from a genetic factor and a binary environmental exposure that jointly influence disease risk. Both simulation and real data analyses revealed higher power of the NOIA model for detecting both main genetic effects and interaction effects for both quantitative and binary traits. We also found that estimates of the parameters from logistic regression for binary traits are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed four SNPs in the 5p15 and 15q25 regions to be significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968 and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region interact significantly with smoking in the Caucasian population. Our approach identified a potential interaction of SNP rs2256543 in 6p21 with smoking in contributing to lung cancer risk. Genetic imprinting is the best-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity. This phenomenon has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon. In this study, we propose a NOIA framework for a single-locus association study that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models, and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs.
Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of variance components if Hardy-Weinberg Equilibrium (HWE) holds or minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
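A minimal sketch of the kind of statistical (orthogonal) genotype coding that the NOIA framework relies on, assuming the published NOIA statistical parameterization for a single biallelic locus. The genotype frequencies below are invented, and the dissertation's Stat-POE and Func-POE models add parent-of-origin terms not shown here:

```python
import numpy as np

def noia_statistical_codes(p_AA, p_Aa, p_aa):
    """NOIA-style statistical coding for genotypes (AA, Aa, aa).

    The additive score centers the minor-allele count on its population mean;
    the dominance score follows the NOIA statistical parameterization, which
    makes both contrasts orthogonal under the observed genotype frequencies.
    """
    mu = p_Aa + 2 * p_aa                         # mean minor-allele count
    additive = np.array([0.0, 1.0, 2.0]) - mu
    D = p_AA + p_aa - (p_AA - p_aa) ** 2
    dominance = np.array([-2 * p_Aa * p_aa,
                           4 * p_AA * p_aa,
                          -2 * p_AA * p_Aa]) / D
    return additive, dominance

# Illustrative genotype frequencies (deliberately not in HWE).
freqs = np.array([0.5, 0.3, 0.2])
a, d = noia_statistical_codes(*freqs)
# Both contrasts have zero mean and zero cross-moment under these frequencies,
# which is what keeps the additive and dominance estimates uncorrelated.
```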
Abstract:
In recent years the memristor, the missing fourth circuit component, was successfully synthesized. However, the mathematical complexity and variety of the models behind this component, in addition to the existence of convergence problems in the simulations, make the design of memristor-based applications long and difficult. In this work we present a memristor model characterization framework which supports the automated generation of subcircuit files. The proposed environment allows the designer to choose and parameterize the memristor model that best suits a given application. The framework carries out characterizing simulations in order to study possible non-convergence problems, resolving dependences on the simulation conditions and guaranteeing the functionality and performance of the design. Additionally, the occurrence of undesirable effects related to PVT variations is also taken into account. By performing a Monte Carlo or a corner analysis, the designer is made aware of the safety margins that assure correct device operation.
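The Monte Carlo and corner analyses mentioned above can be schematized with a toy linear ion-drift memristance expression. The model, parameter values, and spreads below are illustrative assumptions, not the framework's actual device models or SPICE-level simulations:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical linear ion-drift memristance: M(w) = Ron*w + Roff*(1 - w),
# where w is the normalized doped-region width (all values illustrative).
Ron_nom, Roff_nom, w = 100.0, 16e3, 0.3
sigma = 0.05                          # assumed 5% PVT spread on both resistances

def memristance(Ron, Roff, w):
    return Ron * w + Roff * (1.0 - w)

# Monte Carlo analysis: sample PVT-perturbed devices and collect the metric.
Ron = Ron_nom * (1 + sigma * rng.standard_normal(10_000))
Roff = Roff_nom * (1 + sigma * rng.standard_normal(10_000))
samples = memristance(Ron, Roff, w)

# Corner analysis: evaluate the four +/-3-sigma extreme parameter combinations.
corners = [memristance(Ron_nom * (1 + s1 * 3 * sigma),
                       Roff_nom * (1 + s2 * 3 * sigma), w)
           for s1 in (-1, 1) for s2 in (-1, 1)]
lo, hi = min(corners), max(corners)
inside = np.mean((samples >= lo) & (samples <= hi))  # MC fraction within corners
```

The corner bounds here act as the safety margins: nearly all Monte Carlo samples fall between the extreme corners, so a design verified at the corners covers the sampled PVT spread.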
Abstract:
In recent decades, full electric and hybrid electric vehicles have emerged as an alternative to conventional cars due to a range of factors, including environmental and economic aspects. These vehicles are the result of considerable efforts to seek ways of reducing the use of fossil fuel for vehicle propulsion. Sophisticated technologies such as hybrid and electric powertrains require careful study and optimization. Mathematical models play a key role at this point. Currently, many advanced mathematical analysis tools, as well as computer applications, have been built for vehicle simulation purposes. Given the great interest in hybrid and electric powertrains, along with the increasing importance of reliable computer-based models, the author decided to integrate both aspects into the purpose of this work. Furthermore, this is one of the first final degree projects held at the ETSII (Higher Technical School of Industrial Engineers) that covers the study of hybrid and electric propulsion systems. The present project is based on MBS3D 2.0, a specialized software for the dynamic simulation of multibody systems developed at the UPM Institute of Automobile Research (INSIA). Automobiles are a clear example of complex multibody systems, which are present in nearly every field of engineering. The work presented here benefits from the availability of MBS3D software. This program has proven to be a very efficient tool, with a highly developed underlying mathematical formulation. On this basis, the focus of this project is the extension of MBS3D features in order to be able to perform dynamic simulations of hybrid and electric vehicle models. This requires the joint simulation of the mechanical model of the vehicle, together with the model of the hybrid or electric powertrain. These sub-models belong to completely different physical domains.
In fact the powertrain consists of energy storage systems, electrical machines and power electronics, connected to purely mechanical components (wheels, suspension, transmission, clutch…). The challenge today is to create a global vehicle model that is valid for computer simulation. Therefore, the main goal of this project is to apply co-simulation methodologies to a comprehensive model of an electric vehicle, where sub-models from different areas of engineering are coupled. The created electric vehicle (EV) model consists of a separately excited DC electric motor, a Li-ion battery pack, a DC/DC chopper converter and a multibody vehicle model. Co-simulation techniques allow car designers to simulate complex vehicle architectures and behaviors, which are usually difficult to implement in a real environment due to safety and/or economic reasons. In addition, multi-domain computational models help to detect the effects of different driving patterns and parameters and improve the models in a fast and effective way. Automotive designers can greatly benefit from a multidisciplinary approach of new hybrid and electric vehicles. In this case, the global electric vehicle model includes an electrical subsystem and a mechanical subsystem. The electrical subsystem consists of three basic components: electric motor, battery pack and power converter. A modular representation is used for building the dynamic model of the vehicle drivetrain. This means that every component of the drivetrain (submodule) is modeled separately and has its own general dynamic model, with clearly defined inputs and outputs. Then, all the particular submodules are assembled according to the drivetrain configuration and, in this way, the power flow across the components is completely determined. Dynamic models of electrical components are often based on equivalent circuits, where Kirchhoff’s voltage and current laws are applied to draw the algebraic and differential equations. 
Here, a Randles circuit is used for dynamic modeling of the battery, and the electric motor is modeled through the analysis of the equivalent circuit of a separately excited DC motor, in which the power converter is included. The mechanical subsystem is defined by the MBS3D equations. These equations consider the position, velocity and acceleration of all the bodies comprising the vehicle multibody system. MBS3D 2.0 is entirely written in MATLAB, and the structure of the program has been thoroughly studied and understood by the author. The MBS3D software is adapted according to the requirements of the applied co-simulation method. Some of the core functions are modified, such as the integrator and graphics routines, and several auxiliary functions are added in order to compute the mathematical model of the electrical components. By coupling and co-simulating both subsystems, it is possible to evaluate the dynamic interaction among all the components of the drivetrain. A ‘tight-coupling’ method is used to co-simulate the sub-models. This approach integrates all subsystems simultaneously, and the results of the integration are exchanged by function call. This means that the integration is done jointly for the mechanical and the electrical subsystem under a single integrator, and the speed of integration is therefore determined by the slower subsystem. Simulations are then used to show the performance of the developed EV model. However, this project focuses more on the validation of the computational and mathematical tool for electric and hybrid vehicle simulation. For this purpose, a detailed study and comparison of different integrators within the MATLAB environment is done. Consequently, the main efforts are directed towards the implementation of co-simulation techniques in the MBS3D software. In this regard, it is not intended to create an extremely precise EV model in terms of real vehicle performance, although an acceptable level of accuracy is achieved.
The gap between the EV model and the real system is filled, in a way, by introducing the gas and brake pedal inputs, which reflect actual driver behavior. This input is included directly in the differential equations of the model, and determines the amount of current provided to the electric motor. For a separately excited DC motor, the rotor current is proportional to the traction torque delivered to the car wheels. Therefore, as in real vehicles, the propulsion torque in the mathematical model is controlled through acceleration and brake pedal commands. The designed transmission system also includes a reduction gear that adapts and transfers the torque coming from the motor drive. The main contribution of this project is, therefore, the implementation of a new calculation path for the wheel torques, based on performance characteristics and outputs of the electric powertrain model. Originally, the wheel traction and braking torques were input to MBS3D through a vector directly computed by the user in a MATLAB script. Now, they are calculated as a function of the motor current which, in turn, depends on the current provided by the battery pack across the DC/DC chopper converter. The motor and battery currents and voltages are the solutions of the electrical ODE (Ordinary Differential Equation) system coupled to the multibody system. Simultaneously, the outputs of the MBS3D model are the position, velocity and acceleration of the vehicle at all times. The motor shaft speed is computed from the output vehicle speed considering the wheel radius, the gear reduction ratio and the transmission efficiency. This motor shaft speed, available from the MBS3D model, is then introduced into the differential equations corresponding to the electrical subsystem.
In this way, MBS3D and the electrical powertrain model are interconnected, and both subsystems exchange values as expected with the tight-coupling approach. When programming mathematical models of complex systems, code optimization is a key step in the process. A way to improve the overall performance of the integration, making use of C/C++ as an alternative programming language, is described and implemented. Although this entails a greater programming effort, it leads to important advantages regarding co-simulation speed and stability. In order to do this, it is necessary to integrate MATLAB with another integrated development environment (IDE) where C/C++ code can be generated and executed. In this project, the C/C++ files are programmed in Microsoft Visual Studio, and the interface between both IDEs is created by building C/C++ MEX file functions, i.e., programs containing functions or subroutines that can be dynamically linked and executed from MATLAB. This process achieves reductions in simulation time of up to two orders of magnitude. The tests performed with different integrators also reveal the stiff character of the differential equations corresponding to the electrical subsystem and allow the co-simulation process to be improved. When varying the parameters of the integration and/or the initial conditions of the problem, the solutions of the system of equations show better dynamic response and stability, depending on the integrator used. Several integrators, with fixed and variable step sizes and for stiff and non-stiff problems, are applied to the coupled ODE system, and the results are analyzed, compared and discussed. From all the above, the project can be divided into four main parts: 1. Creation of the equation-based electric vehicle model; 2. Programming, simulation and adjustment of the electric vehicle model; 3. Application of co-simulation methodologies to MBS3D and the electric powertrain subsystem; and 4. 
Code optimization and study of different integrators. Additionally, to place the project in context, the first chapters include an introduction to basic vehicle dynamics, the current classification of hybrid and electric vehicles, and an explanation of the technologies involved, such as brake energy regeneration, electric and non-electric propulsion systems for EVs and HEVs (hybrid electric vehicles), and their control strategies. The problem of dynamic modeling of hybrid and electric vehicles is then discussed, and the integrated development environment and the simulation tool are briefly described. The core chapters explain the major co-simulation methodologies and how they have been programmed and applied to the electric powertrain model together with the multibody dynamic model. Finally, the last chapters summarize the main results and conclusions of the project and propose further research topics. In conclusion, co-simulation methodologies are applicable within the integrated development environments MATLAB and Visual Studio and the simulation tool MBS3D 2.0, where equation-based models of multidisciplinary subsystems, consisting of mechanical and electrical components, are coupled and integrated in a very efficient way.
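The stiffness issue reported for the electrical subsystem can be demonstrated with a standard toy problem. This Python sketch (illustrative only; the project itself compares integrators within MATLAB) shows why an explicit, non-stiff method fails on a fast time constant at a step size where an implicit, stiff-capable method remains stable:

```python
# Why stiffness matters: on the test equation dy/dt = -lam * y, an
# explicit method diverges once the step exceeds its stability limit
# 2/lam, while an implicit method stays stable at the same step size.
# Values are illustrative, not taken from the project.

lam = 1000.0    # fast electrical time constant -> stiff problem
h = 0.01        # step size well above the explicit limit 2/lam = 0.002

def explicit_euler(y, steps):
    for _ in range(steps):
        y = y + h * (-lam * y)       # y_{n+1} = y_n + h*f(y_n)
    return y

def implicit_euler(y, steps):
    for _ in range(steps):
        y = y / (1.0 + h * lam)      # solves y_{n+1} = y_n + h*f(y_{n+1})
    return y

print(abs(explicit_euler(1.0, 50)))  # grows without bound: |1 - h*lam| = 9 > 1
print(implicit_euler(1.0, 50))       # decays toward 0, like the true solution
```

This is the behavior that motivates comparing fixed- and variable-step integrators for stiff and non-stiff problems on the coupled ODE system: an integrator designed for stiff equations can take much larger steps on the electrical subsystem without losing stability.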
Abstract:
Partnering with families, school personnel, and community resources is an important step in supporting the child and family, especially when children suffer from debilitating anxiety. However, little research examines the impact of anxiety on math performance for young children participating in school-based interventions enhanced by family components. The following research questions were addressed in the study: 1a) Will a young child with elevated levels of anxiety show a decrease in anxiety symptoms with a Cognitive Behavioral framework intervention program for children? 1b) Will anxiety be further reduced with the addition of Conjoint Behavioral Consultation with the family and teacher? 2a) Will a young child show an increase in math performance after participation in a Cognitive Behavioral framework intervention program for children? 2b) Will math performance be further increased with the addition of Conjoint Behavioral Consultation with the family and teacher? A single-subject, staggered-baseline-across-situations intervention study addressed whether the Coping Cat, an evidence-based child-focused intervention now widely used in schools and clinics to treat childhood anxiety, combined with family and school consultation, would decrease elevated anxiety levels and improve math performance in an elementary-aged student. The objective was to support the mental health development and math performance of an eight-year-old female elementary student through a collaborative effort of stakeholders in the student's life. Baseline data were collected with repeated measures of anxiety and math performance and compared to two intervention phases: first, a child-focused intervention, and second, a family and school consultation. The study tested the theory that the Cognitive Behavioral intervention and the Conjoint Behavioral Consultation intervention would positively influence anxiety levels and math performance for an elementary-aged student. 
Results indicate that the child participant with elevated levels of anxiety showed a reduction in symptoms with the introduction of a Cognitive Behavioral framework intervention when compared to her baseline data. The participant showed further reduction in symptoms across the school and home settings with the implementation of Conjoint Behavioral Consultation when compared to baseline and the first intervention phase. Math performance began to increase with the introduction of the Cognitive Behavioral intervention, and continued to improve with the implementation of the Conjoint Behavioral Consultation. Findings suggest that consultation should begin immediately when an intervention is implemented in order to enhance outcomes.
Abstract:
Objective: Healthy relationships between adolescents and their caregivers have been robustly associated with better youth outcomes in a variety of domains. Youth in contact with the child welfare system are at higher risk for worse outcomes, including mental health problems and home placement instability. A growing body of literature points to youth mental health problems as both a predictor and a consequence of home placement instability in this population; the present study aimed to expand our understanding of these phenomena by examining the interplay among the caregiver-child relationship, youth mental health symptoms, and placement change over time. Method: The sample consisted of 1,179 youths aged 11-16 from the National Survey of Child and Adolescent Well-Being, a nationally representative sample of children in contact with the child welfare system. We used bivariate correlations and autoregressive cross-lagged path analysis to examine how youths’ reports of their externalizing and internalizing symptoms, their relationship with their caregivers, and placement changes reciprocally influenced one another over three time points. Results: In the overall models, early internalizing symptoms significantly negatively predicted the quality of the caregiver-child relationship at the next time point, and early externalizing symptoms predicted subsequent placement change. In addition, later externalizing symptoms negatively predicted subsequent reports of relationship quality, and later placement changes predicted subsequent externalizing problems; these effects reached only trend-level significance (p < .10). The quality of the relationship was significantly negatively correlated with externalizing and internalizing problems at all time points, and all variables demonstrated autoregressive stability over time. 
Conclusions: Our findings support the importance of comprehensive interventions for youth in contact with the child welfare system, which target not only youth symptoms in isolation, but also the caregiver-child relationship, as a way to improve social-emotional outcomes in this high-risk population.