807 results for Technology Acceptance Model TAM


Relevance:

30.00%

Publisher:

Abstract:

At Airbus GmbH (Hamburg), a new design of the Rear Pressure Bulkhead (RPB) has been developed for the A320 family. The new model is formed using vacuum forming technology, a process during which the wrinkling phenomenon occurs. This thesis describes an analytical model for the prediction of wrinkling based on the energy method of Timoshenko. Large-deflection theory has been used to analyze two case studies: a simply supported circular thin plate stamped by a spherical punch, and a simply supported circular thin plate formed by the vacuum forming technique. If the edges are free to displace radially, thin plates develop radial wrinkles near the edge at a central deflection approximately equal to four plate thicknesses (w0/h ≈ 4) when stamped by a spherical punch, and three plate thicknesses (w0/h ≈ 3) when formed by vacuum forming. Initially there are four symmetrical wrinkles, but their number increases as the central deflection grows. Using experimental results, the "snap-through" phenomenon is described.
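The quoted onset thresholds lend themselves to a quick back-of-envelope estimate. The sketch below is illustrative only: the ratios are the empirical values cited in the abstract, while the function name and example numbers are ours, not from the thesis.

```python
# Illustrative sketch (not thesis code): onset of radial wrinkling in a thin
# circular plate with radially free edges, using the empirical thresholds
# w0/h ~ 4 (spherical-punch stamping) and w0/h ~ 3 (vacuum forming).

WRINKLING_RATIO = {"punch": 4.0, "vacuum": 3.0}  # w0/h at wrinkling onset

def onset_deflection(thickness_mm: float, process: str) -> float:
    """Central deflection w0 (mm) at which radial wrinkles first appear."""
    return WRINKLING_RATIO[process] * thickness_mm
```

For example, a 1.6 mm sheet formed by vacuum forming would be expected to wrinkle near a central deflection of about 4.8 mm under this rule of thumb.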

Relevance:

30.00%

Publisher:

Abstract:

The primary goal of this work is the extension of an analytic electro-optical model. It is used to describe single-junction crystalline silicon solar cells and a silicon/perovskite tandem solar cell in the presence of light trapping, in order to calculate efficiency limits for such a device. In particular, our tandem system is composed of crystalline silicon and a perovskite-structure material: methylammonium lead triiodide (MALI). Perovskites are among the most attractive materials for photovoltaics thanks to their low cost and rising efficiencies: solar cell efficiencies of devices using these materials increased from 3.8% in 2009 to a certified 20.1% in 2014, making this the fastest-advancing solar technology to date. Moreover, texturization increases the amount of light that can be absorbed in an active layer. Using Green's formalism it is possible to calculate the photogeneration rate of a single-layer structure with Lambertian light trapping analytically. In this work we go further: we study the optical coupling between the two cells of our tandem system in order to calculate the photogeneration rate of the whole structure. We also model the electronic part of the device by treating the perovskite top cell as an ideal diode and solving the drift-diffusion equations with appropriate boundary conditions for the silicon bottom cell. We have a four-terminal structure, so our tandem system is totally unconstrained. We then calculate the efficiency limits of the tandem, including several recombination mechanisms such as Auger, SRH and surface recombination. We also focus on the dependence of the results on the band gap of the perovskite and calculate the optimal band gap that maximizes the tandem efficiency. The whole work has been continuously supported by numerical validation of our analytic model against Silvaco ATLAS, which solves the drift-diffusion equations using a finite element method.
Our goal is to develop a simpler and cheaper, but accurate, model to study such devices.
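As a hedged illustration of what Lambertian light trapping buys, the sketch below evaluates the standard Tiedje-Yablonovitch absorptance limit for a slab; this is the textbook formula, assumed here for illustration, and is not necessarily the exact expression used in the thesis model.

```python
# Hedged sketch (assumed, not the thesis model): Lambertian light-trapping
# absorptance of a slab in the Tiedje-Yablonovitch limit,
#   A = alpha / (alpha + 1 / (4 * n^2 * W)),
# with alpha the absorption coefficient (1/cm), n the refractive index
# and W the layer thickness (cm).

def lambertian_absorptance(alpha_cm: float, n: float, thickness_cm: float) -> float:
    return alpha_cm / (alpha_cm + 1.0 / (4.0 * n ** 2 * thickness_cm))
```

For weakly absorbed light (alpha*W << 1) the absorptance approaches 4*n^2*alpha*W, i.e. the well-known 4n^2 path-length enhancement; e.g. alpha = 1 cm^-1, n = 3.5, W = 100 um gives A of roughly 0.33 instead of the single-pass ~0.01.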

Relevance:

30.00%

Publisher:

Abstract:

Model-based calibration has gained popularity in recent years as a method to optimize increasingly complex engine systems. However, virtually all model-based techniques are applied to steady-state calibration; transient calibration is by and large an emerging technology. An important piece of any transient calibration process is the ability to constrain the optimizer to treat the problem as a dynamic one and not as a quasi-static process. The optimized air-handling parameters corresponding to any instant of time must be achievable in a transient sense; this in turn depends on the trajectory of the same parameters over previous time instances. In this work, dynamic constraint models are proposed to translate commanded air-handling parameters into those actually achieved. These models enable the optimization to be realistic in a transient sense. The air-handling system has been treated as a linear second-order system with PD control, whose parameters were extracted from real transient data. The model was shown to be the best choice among a list of appropriate candidates, including neural networks and first-order models. The selected second-order model was used in conjunction with transient emission models to predict emissions over the FTP cycle. It was shown that emission predictions based on air-handling parameters predicted by the dynamic constraint model do not differ significantly from corresponding emissions based on measured air-handling parameters.
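The "commanded vs. achieved" idea can be sketched by simulating a linear second-order tracking system. The code below is a minimal illustration under assumed damping and natural-frequency values (zeta, wn are ours, not the thesis calibration), using symplectic Euler integration.

```python
# Hedged sketch (illustrative, not the thesis code): a commanded air-handling
# parameter u tracked by a linear second-order system
#   y'' + 2*zeta*wn*y' + wn^2 * y = wn^2 * u,
# integrated with symplectic Euler. zeta and wn are assumed example values.

def second_order_step(u: float, zeta: float, wn: float, dt: float, steps: int):
    """Return the achieved-parameter trajectory for a constant command u."""
    y, yd = 0.0, 0.0
    out = []
    for _ in range(steps):
        ydd = wn ** 2 * (u - y) - 2.0 * zeta * wn * yd  # acceleration
        yd += dt * ydd
        y += dt * yd
        out.append(y)
    return out
```

With, say, zeta = 0.7 and wn = 5 rad/s, the achieved value lags the command and settles toward it after a few time constants, which is exactly the kind of lag a dynamic constraint model must impose on the optimizer.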

Relevance:

30.00%

Publisher:

Abstract:

A series of CCSD(T) single-point calculations on MP4(SDQ) geometries, together with the W1 model chemistry method, have been used to calculate ΔH° and ΔG° values for 17 gas-phase deprotonation reactions whose experimental values have reported accuracies within 1 kcal/mol. These values have been compared with previous calculations using the G3 and CBS model chemistries and two DFT methods. The most accurate CCSD(T) method uses the aug-cc-pVQZ basis set. Extrapolation of the aug-cc-pVTZ and aug-cc-pVQZ results yields the best agreement with experiment, with a standard deviation of 0.58 kcal/mol for ΔG° and 0.70 kcal/mol for ΔH°. Standard deviations from experiment for ΔG° and ΔH° for the W1 method are 0.95 and 0.83 kcal/mol, respectively. The G3 and CBS-APNO results are competitive with W1 and are much less expensive. Any of the model chemistry methods, or the CCSD(T)/aug-cc-pVQZ method, can serve as a valuable check on the accuracy of experimental data reported in the National Institute of Standards and Technology (NIST) database.
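Basis-set extrapolation of the TZ/QZ pair is commonly done with a two-point inverse-cube formula. The sketch below shows that standard Helgaker-style scheme; it is an assumption for illustration and not necessarily the exact extrapolation used in the paper.

```python
# Hedged sketch: standard two-point inverse-cube basis-set extrapolation
# (assumed scheme, not necessarily the paper's):
#   E_CBS = (X^3 * E_X - Y^3 * E_Y) / (X^3 - Y^3)
# for cardinal numbers X (e.g. 3 for aug-cc-pVTZ) and Y (e.g. 4 for aug-cc-pVQZ).

def cbs_two_point(e_x: float, x: int, e_y: float, y: int) -> float:
    """Extrapolate two correlation energies to the basis-set limit."""
    return (x ** 3 * e_x - y ** 3 * e_y) / (x ** 3 - y ** 3)
```

If the energies follow E_X = E_CBS + A/X^3 exactly, the formula recovers E_CBS regardless of A, which is the point of the two-point construction.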

Relevance:

30.00%

Publisher:

Abstract:

Advances in food transformation have dramatically increased the diversity of products on the market and, consequently, exposed consumers to a complex spectrum of bioactive nutrients whose potential risks and benefits have mostly not been confidently demonstrated. Therefore, tools are needed to efficiently screen products for selected physiological properties before they enter the market. NutriChip is an interdisciplinary modular project funded by the Swiss programme Nano-Tera, which groups scientists from several areas of research with the aim of developing analytical strategies that will enable functional screening of foods. The project focuses on postprandial inflammatory stress, which potentially contributes to the development of chronic inflammatory diseases. The first module of the NutriChip project is composed of three in vitro biochemical steps that mimic the digestion process, intestinal absorption, and subsequent modulation of immune cells by the bioavailable nutrients. The second module is a miniaturised form of the first module (gut-on-a-chip) that integrates a microfluidic-based cell co-culture system and super-resolution imaging technologies to provide a physiologically relevant fluid flow environment and allows sensitive real-time analysis of the products screened in vitro. The third module aims at validating the in vitro screening model by assessing the nutritional properties of selected food products in humans. Because of the immunomodulatory properties of milk as well as its amenability to technological transformation, dairy products have been selected as model foods. The NutriChip project reflects the opening of food and nutrition sciences to state-of-the-art technologies, a key step in the translation of transdisciplinary knowledge into nutritional advice.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: The Health Technology Assessment report on the effectiveness, cost-effectiveness and appropriateness of homeopathy was compiled on behalf of the Swiss Federal Office of Public Health (BAG) within the framework of the 'Program of Evaluation of Complementary Medicine (PEK)'. Materials and Methods: Databases accessible via the Internet were systematically searched, complemented by manual searches and contacts with experts, and evaluated according to internal and external validity criteria. Results: Many high-quality investigations in pre-clinical basic research showed that homeopathic high potencies induce regulatory and specific changes in cells or living organisms. 20 of 22 systematic reviews detected at least a trend in favor of homeopathy, and in our estimation 5 studies yielded results indicating clear evidence for homeopathic therapy. The evaluation of 29 studies in the domain 'Upper Respiratory Tract Infections/Allergic Reactions' showed a positive overall result in favor of homeopathy: 6 out of 7 controlled studies were at least equivalent to conventional medical interventions, and 8 out of 16 placebo-controlled studies were significant in favor of homeopathy. Swiss regulations grant a high degree of safety due to product and training requirements for homeopathic physicians. Applied properly, classical homeopathy has few side effects, and the use of high potencies is free of toxic effects. A general health-economic statement about homeopathy cannot be made from the available data. Conclusion: Taking internal and external validity criteria into account, the effectiveness of homeopathy can be supported by clinical evidence, and its professional and adequate application can be regarded as safe. Reliable statements on cost-effectiveness are not available at the moment. External and model validity will have to be taken more strongly into consideration in future studies.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this project was to investigate the effect of using data collection technology on student attitudes towards science instruction. The study was conducted over the course of two years at Madison High School in Adrian, Michigan, primarily in college preparatory physics classes, but also in one college preparatory chemistry class and one environmental science class. A preliminary study was conducted at a Lenawee County Intermediate Schools student summer environmental science day camp. The data collection technology used was a combination of Texas Instruments TI-84 Silver Plus graphing calculators and Vernier LabPro data collection sleds with various probeware attachments, including motion sensors, pH probes and accelerometers. Students were given written procedures for most laboratory activities and were provided with data tables and analysis questions to answer about the activities. The first year of the study included a pre-test and post-test measuring student attitudes towards the class in which they were enrolled. Pre-test and post-test data were analyzed to determine the effect size, which was found to be very small (Coe, 2002). The second year of the study focused only on a physics class and used Keller's ARCS model for measuring student motivation based on the four aspects of motivation: Attention, Relevance, Confidence and Satisfaction (Keller, 2010). According to this model, there were two distinct groups in the class, one motivated to learn and one that was not. The data suggest that the use of data collection technology in science classes should be started early in a student's career, possibly in early middle school or late elementary school. This would build familiarity with the equipment and allow for greater exploration by students as they progress through high school and into upper-level science courses.
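The pre/post effect size mentioned above is typically computed as Cohen's d, the standardized mean difference discussed by Coe (2002). The sketch below shows that standard calculation; it is illustrative and not the study's actual analysis code, and the example scores are made up.

```python
# Illustrative sketch (not the study's code): Cohen's d for two score groups,
#   d = (mean2 - mean1) / s_pooled,
# with s_pooled the pooled sample standard deviation.
import statistics

def cohens_d(pre, post):
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.variance(pre), statistics.variance(post)  # sample variances
    s_pooled = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(post) - statistics.mean(pre)) / s_pooled
```

By common convention, |d| around 0.2 is considered small, which is the range the first-year data fell into.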

Relevance:

30.00%

Publisher:

Abstract:

High concentrations of fluoride naturally occurring in the groundwater of the Arusha region of Tanzania cause dental, skeletal and non-skeletal fluorosis in up to 90% of the region's population [1]. Symptoms of this incurable but completely preventable disease include brittle, discolored teeth, malformed bones, and stiff and swollen joints. The consumption of high-fluoride water has also been proven to cause headaches and insomnia [2] and to adversely affect the development of children's intelligence [3, 4]. Although this array of symptoms may significantly impact a society's development and its citizens' ability to perform work and enjoy a reasonable quality of life, little is offered in the Arusha region in the form of solutions for the poor, those hardest hit by the problem. Multiple defluoridation technologies do exist, yet none are successfully reaching the Tanzanian public. This report takes a closer look at the efforts of one local organization, the Defluoridation Technology Project (DTP), to address the region's fluorosis problem through the production and dissemination of bone char defluoridation filters, an appropriate-technology solution that is proven to work. The goal of this research is to improve the sustainability of DTP's operations and help them reach a wider range of clients so that they may reduce the occurrence of fluorosis more effectively. This was done first through laboratory testing of current products. Results of this testing show a wide range in uptake capacity across batches of bone char, emphasizing the need to modify the kiln design in order to produce a more consistent, higher-quality product. The issue of filter dissemination was addressed through the development of a multi-level, customer-funded business model promoting the availability of filters to Tanzanians of all socioeconomic levels.
Central to this model is the recommendation to focus on community-managed, institutional-sized filters in order to make fluoride-free water available to lower-income clients and to increase Tanzanian involvement at the management level.
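The "uptake capacity" compared across bone char batches is normally obtained from batch adsorption arithmetic. The sketch below shows that standard calculation; it is an assumption for illustration (including the example numbers), not DTP's laboratory protocol.

```python
# Hedged sketch (standard batch-adsorption arithmetic, not DTP's protocol):
# fluoride uptake capacity of a bone char sample,
#   q = (C0 - Ce) * V / m   [mg of fluoride per g of char],
# where C0 and Ce are the initial and equilibrium concentrations (mg/L),
# V the solution volume (L) and m the adsorbent mass (g).

def uptake_capacity(c0_mg_L: float, ce_mg_L: float, volume_L: float, mass_g: float) -> float:
    return (c0_mg_L - ce_mg_L) * volume_L / mass_g
```

For example, reducing 1 L of water from 10 mg/L to 2 mg/L with 4 g of char corresponds to a capacity of 2.0 mg/g; comparing this figure across batches is what reveals the kiln-to-kiln variability noted above.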

Relevance:

30.00%

Publisher:

Abstract:

The selective catalytic reduction (SCR) system is a well-established technology for NOx emissions control in diesel engines. A one-dimensional, single-channel SCR model was previously developed using reactor data generated at Oak Ridge National Laboratory (ORNL) for an iron-zeolite catalyst system. Calibration of this model to fit the experimental reactor data collected at ORNL for a copper-zeolite SCR catalyst is presented here. Initially, a test protocol was developed in order to investigate the different phenomena responsible for the SCR system response. An SCR model with two distinct types of storage sites was used. The calibration process started with storage capacity calculations for the catalyst sample; then the chemical kinetics occurring in each segment of the protocol were investigated. The reactions included in the model were adsorption, desorption, standard SCR, fast SCR, slow SCR, NH3 oxidation, NO oxidation and N2O formation. The reaction rates were identified for each temperature using a time-domain optimization approach. Assuming an Arrhenius form for the reaction rates, activation energies and pre-exponential parameters were fit to the identified rates. The results indicate that the Arrhenius form is appropriate and that the reaction scheme used allows the model to fit the experimental data, and also makes it suitable for use in real-world engine studies.
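Fitting Arrhenius parameters to rates identified at several temperatures is usually done as a linear regression of ln k against 1/T. The sketch below shows that standard procedure; it is illustrative and not the calibration code used in the work.

```python
# Hedged sketch (not the calibration code itself): extract Arrhenius
# parameters A and Ea from rate constants k(T) via the linearization
#   ln k = ln A - Ea / (R * T),
# i.e. an ordinary least-squares fit of ln k vs. 1/T.
import math

R = 8.314  # gas constant, J/(mol*K)

def fit_arrhenius(temps_K, rates):
    """Return (A, Ea) fitted to rate constants at the given temperatures."""
    xs = [1.0 / t for t in temps_K]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    ln_a = ybar - slope * xbar
    return math.exp(ln_a), -slope * R  # slope = -Ea/R
```

On synthetic rates generated from known A and Ea, the fit recovers both parameters exactly, which is a quick sanity check before applying it to identified rates.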

Relevance:

30.00%

Publisher:

Abstract:

A range of societal issues has been caused by fossil fuel consumption in the transportation sector in the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuel production from various lignocellulosic biomass types such as wood, forest residues, and agricultural residues has the potential to replace a substantial portion of total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize the overall cost. For this purpose, an integrated methodology was proposed that combines GIS technology with simulation and optimization modeling. The GIS-based methodology serves as a precursor to the simulation and optimization modeling: it pre-selects potential biofuel facility locations for production from forest biomass, and the resulting candidate sites serve as inputs to the models. Candidate locations were selected based on a set of evaluation criteria, including county boundaries, the railroad transportation network, the state/federal road transportation network, the dispersion of water bodies (rivers, lakes, etc.), city and village dispersion, population census data, biomass production, and no co-location with co-fired power plants. The simulation and optimization models were built around the key supply activities of biomass harvesting/forwarding, transportation and storage. On-site storage was built to serve the spring breakup period, when road restrictions are in place and truck transportation on certain roads is limited.
Both models were evaluated using multiple performance indicators, including cost (consisting of the delivered feedstock cost and the inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms to remain consistent with cost. Compared with the optimization model, the simulation model provides a more dynamic look at a 20-year operation by considering the impacts associated with building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated, and the inventory level was tracked year-round. Through the exchange of information across the different procedures (harvesting, transportation, and biomass feedstock processing), a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address issues related to locating multiple biofuel facilities simultaneously. The size of each potential biofuel facility is bounded between 30 MGY and 50 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application that allows for sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability impact the optimal biofuel facility locations and sizes.
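The core site-selection decision can be caricatured as a classic facility-location problem. The toy sketch below is purely illustrative (the thesis uses an MPL-based optimization model, not this code, and the distance matrix is made up): it brute-forces which subset of candidate sites minimizes total transport distance when each supply area ships to its nearest opened facility.

```python
# Toy sketch (illustrative only; the thesis uses an MPL-based model):
# brute-force p-median style facility location over pre-selected candidates.
from itertools import combinations

def best_sites(dist, n_facilities):
    """dist[i][j]: distance from supply area i to candidate site j.
    Returns (min_total_distance, chosen_site_indices)."""
    n_candidates = len(dist[0])
    best = None
    for subset in combinations(range(n_candidates), n_facilities):
        # each supply area ships to its nearest opened facility
        cost = sum(min(row[j] for j in subset) for row in dist)
        if best is None or cost < best[0]:
            best = (cost, subset)
    return best
```

Real models add capacity bounds (e.g. the 30-50 MGY sizing above), transport modes and inventory, but the combinatorial site-choice core is the same.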

Relevance:

30.00%

Publisher:

Abstract:

The single-electron transistor (SET) is one of the best candidates for future nanoelectronic circuits because of its ultra-low power consumption, small size and unique functionality. SET devices operate on the principle of Coulomb blockade, which is more prominent at dimensions of a few nanometers. Typically, the SET device consists of two capacitively coupled ultra-small tunnel junctions with a nano-island between them. In order to observe Coulomb blockade effects in a SET device, the charging energy of the device has to be greater than the thermal energy. This condition limits the operation of most existing SET devices to cryogenic temperatures. Room-temperature operation of SET devices requires sub-10 nm nano-islands because of the inverse dependence of the charging energy on the radius of the conducting nano-island. Fabrication of sub-10 nm structures using lithography processes is still a technological challenge. In the present investigation, focused ion beam (FIB) based etch and deposition technology is used to fabricate single-electron transistor devices operating at room temperature. The SET device incorporates an array of tungsten nano-islands with an average diameter of 8 nm. The fabricated devices are characterized at room temperature, and clear Coulomb blockade and Coulomb oscillations are observed. An improvement in the resolution limitation of the FIB etching process is demonstrated by optimizing the thickness of the active layer. SET devices with structural and topological variations are developed to explore their impact on the behavior of the device. The threshold voltage of the device was minimized to ~500 mV by reducing the source-drain gap of the device to 17 nm. Vertical source and drain terminals are fabricated to realize a single-dot-based SET device, and a unique process flow is developed to fabricate Si-dot-based SET devices for better gate controllability of the device characteristics.
The parameters of the fabricated devices are extracted using a conductance model. Finally, the characteristics of these devices are validated against simulated data from theoretical modeling.
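The room-temperature condition above can be checked with a simple charging-energy estimate. The sketch below assumes the self-capacitance of an isolated sphere, C = 4*pi*eps0*r; this is a deliberately crude model for illustration, not the conductance model used in the thesis (real junction capacitances are larger, so this is an upper-bound estimate of Ec).

```python
# Sketch (assumed spherical self-capacitance model, not the thesis model):
# charging energy Ec = e^2 / (2*C) of an island of radius r, with
# C = 4*pi*eps0*r, compared against the thermal energy kB*T.
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
KB = 1.380649e-23            # Boltzmann constant, J/K

def charging_energy_eV(radius_m: float) -> float:
    c = 4.0 * math.pi * EPS0 * radius_m
    return E_CHARGE / (2.0 * c)  # e^2/(2C) in joules, divided by e -> eV

def coulomb_blockade_at(radius_m: float, temp_K: float) -> bool:
    """True when Ec exceeds kB*T in this crude model."""
    return charging_energy_eV(radius_m) * E_CHARGE > KB * temp_K
```

For an 8 nm-diameter island (r = 4 nm) this gives Ec on the order of 0.18 eV, well above kB*T at 300 K (~0.026 eV), consistent with the observed room-temperature Coulomb blockade; a micron-scale island fails the test, matching the cryogenic limitation of larger devices.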

Relevance:

30.00%

Publisher:

Abstract:

Wind energy has been one of the fastest-growing sectors of the nation's renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years given aggressive governmental policies for the reduction of fossil fuel dependency. The so-called Horizontal Axis Wind Turbine (HAWT) technology has shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies for massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information on the actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, have called for a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model.
To this end, System Identification (SI) techniques offer a spectrum of well-established numerical methods suited to stationary, deterministic, data-driven numerical schemes, capable of predicting actual dynamic states (eigen-realizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric based on Subspace Realization Theory is proposed, adapted for stochastic, non-stationary and time-varying systems, as is the case for a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components of which the wind turbine is made. In the long run, both the aerodynamic framework (theoretical model) and system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, known as Adaptive Simulated Annealing (ASA). This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts the interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped-gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
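The MAC metric driving the model-updating engine has a compact standard definition, sketched below. This is the textbook real-valued form, assumed to be consistent with the thesis (complex mode shapes would use conjugate transposes).

```python
# Minimal sketch (standard definition, assumed consistent with the thesis):
# Modal Assurance Criterion between two mode-shape vectors p and q,
#   MAC(p, q) = |p^T q|^2 / ((p^T p) * (q^T q)),
# equal to 1 for parallel (identical up to scale) shapes, 0 for orthogonal ones.

def mac(p, q):
    dot = sum(a * b for a, b in zip(p, q))
    return dot ** 2 / (sum(a * a for a in p) * sum(b * b for b in q))
```

In a model-updating loop, 1 - MAC between measured and predicted mode shapes is a natural cost for the simulated-annealing search to minimize.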

Relevance:

30.00%

Publisher:

Abstract:

Business strategy is important to all organizations. Nearly all Fortune 500 firms are implementing Enterprise Resource Planning (ERP) systems to improve the execution of their business strategy and to improve its integration with their information technology (IT) strategy. Successful implementation of these multi-million-dollar software systems requires new emphasis on change management and on business-IT strategic alignment. This paper examines business and IT strategic alignment and explores whether an ERP implementation can drive business process reengineering and business-IT strategic alignment. An overview of business strategy and strategic alignment is followed by an analysis of ERP. The "As-Is/To-Be" process model is then presented and explained as a simple but vital tool for improving business strategy, strategic alignment, and ERP implementation success.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a system for 3-D reconstruction of a patient-specific surface model from calibrated X-ray images. Our system requires two X-ray images of a patient with one acquired from the anterior-posterior direction and the other from the axial direction. A custom-designed cage is utilized in our system to calibrate both images. Starting from bone contours that are interactively identified from the X-ray images, our system constructs a patient-specific surface model of the proximal femur based on a statistical model based 2D/3D reconstruction algorithm. In this paper, we present the design and validation of the system with 25 bones. An average reconstruction error of 0.95 mm was observed.

Relevance:

30.00%

Publisher:

Abstract:

SWISSspine is a so-called pragmatic trial for the assessment of the safety and efficiency of total disc arthroplasty (TDA). It follows the new health technology assessment (HTA) principle of "coverage with evidence development" and is the first mandatory HTA registry of its kind in the history of Swiss orthopaedic surgery. Its goal is the generation of evidence for a decision by the Swiss Federal Office of Public Health about reimbursement of the concerned technologies and treatments by the basic health insurance of Switzerland. Between March 2005 and 2008, 427 interventions with implantation of 497 lumbar total disc arthroplasties were documented. Data were collected in a prospective, observational multicenter mode. The preliminary timeframe for the registry was 3 years and has already been extended. Data collection takes place pre- and perioperatively, at the 3-month and 1-year follow-ups, and annually thereafter. Surgery, implant and follow-up case report forms are administered by spinal surgeons; comorbidity questionnaires, NASS and EQ-5D forms are completed by the patients. At the 1-year follow-up there was a significant and clinically relevant reduction of low back pain (VAS, from 70.3 points preoperatively to 29.4 points, p < 0.0001) and leg pain (VAS, from 55.5 to 19.1 points, p < 0.001), an improvement of quality of life (EQ-5D, from 0.32 to 0.73 points, p < 0.001), and a reduction in painkiller consumption. There were 14 (3.9%) complications and 7 (2.0%) revisions within the same hospitalization reported for monosegmental TDA, and 6 (8.6%) complications and 8 (11.4%) revisions for bisegmental surgery. There were 35 patients (9.8%) with complications during follow-up in monosegmental and 9 (12.9%) in bisegmental surgery, and 11 (3.1%) revisions with 1 [corrected] new hospitalization in monosegmental and 1 (1.4%) in bisegmental surgery.
Regression analysis suggested a preoperative VAS "threshold value" of about 44 points for an increased likelihood of a minimum clinically relevant back pain improvement. From a short-term perspective, lumbar TDA appears to be a relatively safe and efficient procedure with respect to pain reduction and improvement of quality of life. Nevertheless, no prediction about the long-term goals of TDA can be made yet. The SWISSspine registry proves to be an excellent tool for the collection of observational data in a nationwide framework, although the advantages and deficits of its design must be considered. It can act as a model for similar projects in other health-care domains.