Abstract:
In this thesis, foundational research toward the implementation of a diamond-based molecular quantum computer is presented. The approach followed requires the linear alignment of endohedral fullerenes on the diamond C(100) surface in the vicinity of subsurface NV centers. From this, four fundamental experimental challenges arise: 1) the well-controlled deposition of endohedral fullerenes on a diamond surface; 2) the creation of NV centers in diamond close to the surface; 3) the preparation and characterization of atomically flat diamond surfaces; and 4) the assembly of linear chains of endohedral fullerenes. First steps toward overcoming all of these challenges were taken in the framework of this thesis. To this end, a so-called “pulse injection” technique was implemented and tested in a UHV chamber custom-designed for this and further tasks. Pulse injection in principle allows the deposition of molecules from solution onto a substrate and can therefore be used to deposit molecular species that are not stable against sublimation under UHV conditions, such as the endohedral fullerenes needed for a quantum register. Regarding the targeted creation of NV centers, FIB experiments were carried out in cooperation with the group of Prof. Schmidt-Kaler (AG Quantum, Physics Department, Johannes Gutenberg-Universität Mainz). As an entry into this challenging task, argon cations were implanted into (111)-oriented CaF2 crystals. The resulting implantation spots on the surface were imaged and characterized using AFM, and general relations between the ions' impact on the surface and their valency and kinetic energy could be established. The main part of this thesis, however, consists of NC-AFM studies on both bare and hydrogen-terminated diamond C(100) surfaces. In cooperation with the group of Prof. Dujardin (Molecular Nanoscience Group, ISMO, Université de Paris XI), clean and atomically flat diamond surfaces were prepared by exposing the substrate to a microwave hydrogen plasma. Subsequently, both surface terminations were imaged at high resolution with NC-AFM. In the process, both hydrogen atoms in the unit cell of the hydrogenated surface were resolved individually, which had not been achieved in previous STM studies of this surface. The NC-AFM images also reveal, for the first time, atomic-resolution contrast on the clean, insulating diamond surface and provide real-space experimental evidence for a (2×1) surface reconstruction. With regard to the quantum computing concept, high-resolution NC-AFM imaging was also used to study the adsorption and self-assembly behavior of two different fullerenes (C60 and C60F48) on the aforementioned diamond surfaces. In the case of the hydrogenated surface, particular attention was paid to the influence of charge transfer doping on the fullerene-substrate interaction and on the morphology emerging from self-assembly. Finally, self-assembled C60 islands on the hydrogen-terminated diamond surface were subjected to active manipulation by the NC-AFM tip, and two different tip-induced island growth modes were observed and are presented. In conclusion, the results obtained provide fundamental information essential for the realization of a molecular quantum computer. In the process, it was shown that NC-AFM is, under proper circumstances, a very capable tool for imaging diamond surfaces at the highest resolution, surpassing even what has been achieved with STM to date.
Particular attention was paid to the influence of transfer doping on the morphology of fullerenes on the hydrogenated diamond surface, revealing new possibilities for tailoring the self-assembly of molecules that have a high electron affinity.
Abstract:
Computing the weighted geometric mean of large sparse matrices is an operation that rapidly becomes intractable as the size of the matrices involved grows. However, if we are interested not in the matrix function itself but only in its product with a vector, the problem becomes simpler, and there is a chance to solve it even when the matrix mean itself would be impossible to compute. Our interest is motivated by the fact that this calculation has practical applications related to the preconditioning of operators arising in domain decomposition of elliptic problems. In this thesis, we explore how such a computation can be performed efficiently. First, we exploit the properties of the weighted geometric mean and find several equivalent ways to express it through real powers of a matrix. We then focus our attention on matrix powers and examine how well-known techniques can be adapted to the problem at hand. In particular, we consider two broad families of approaches for the computation of f(A) v, namely quadrature formulae and Krylov subspace methods, and generalize them to the pencil case f(A\B) v. Finally, we provide an extensive experimental evaluation of the proposed algorithms and assess how convergence speed and execution time are influenced by characteristics of the input matrices. Our results suggest that a few factors have a marked bearing on performance and that, although there is no single best choice in general, knowing the conditioning and sparsity of the arguments beforehand can considerably help in choosing the best strategy to tackle the problem.
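For concreteness, the weighted geometric mean of two symmetric positive definite matrices satisfies A #_t B = A (A^{-1} B)^t, so its action on a vector reduces to a real matrix power applied to a vector. Below is a minimal dense reference sketch in Python (the function name is ours; for large sparse matrices, the dense power would be replaced by the quadrature or Krylov approximations studied in the thesis):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def geometric_mean_times_vector(A, B, t, v):
    """Dense reference computation of (A #_t B) v for SPD A, B and t in [0, 1],
    using the identity A #_t B = A (A^{-1} B)^t. For large sparse matrices one
    would instead approximate (A \\ B)^t v by quadrature or a Krylov method."""
    X = fractional_matrix_power(np.linalg.solve(A, B), t)  # (A^{-1} B)^t
    return A @ (X @ v)

# Example with two small SPD matrices.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
B = np.diag(rng.uniform(1, 3, 5))
v = rng.standard_normal(5)
print(geometric_mean_times_vector(A, B, 0.5, v))
```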
Abstract:
Isochrysis galbana is a widely used strain in aquaculture despite its low productivity. To maximize the productivity of processes based on this microalgal strain, a model was developed considering the influence of irradiance, temperature, pH and dissolved oxygen concentration on the photosynthesis and respiration rates. Results demonstrate that this strain tolerates temperatures up to 35 °C but is highly sensitive to irradiances above 500 µE·m⁻²·s⁻¹ and dissolved oxygen concentrations above 11 mg·l⁻¹. With the research group of the Universidad de Almería, the developed model was validated using data from an industrial-scale outdoor tubular photobioreactor, demonstrating that inadequate temperature and dissolved oxygen concentrations reduce productivity to half of the maximum attainable according to light availability under real outdoor conditions. The developed model is a useful tool for managing working processes, especially for developing new processes based on this strain and for making decisions regarding optimal control strategies. The outdoor production of Isochrysis galbana T-iso in industrial-scale tubular photobioreactors (3.0 m³) was also studied. Experiments were performed varying the dilution rate and evaluating biomass productivity and quality, in addition to the overall performance of the system. Results confirmed that T-iso can be produced outdoors at commercial scale in continuous mode, with productivities of up to 20 g·m⁻²·day⁻¹ of biomass rich in proteins (45%) and lipids (25%). This type of photobioreactor allows control of culture contamination and pH, but the daily variation of solar radiation exposes the cells inside the reactor to inadequate dissolved oxygen concentrations and temperatures. Excessive dissolved oxygen reduced biomass productivity to 68% of the maximum, whereas inadequate temperature reduced it to 63%. Thus, by optimally controlling these parameters, biomass productivity can be doubled. These results confirm the potential for producing this valuable strain at commercial scale in optimally designed and operated tubular photobioreactors as a biotechnological industry.
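The abstract does not give the model equations; as a hedged sketch of the kind of multiplicative model described, a hyperbolic light response can be scaled by temperature, pH and dissolved-oxygen factors (all parameter values below are illustrative assumptions, not the fitted values of the study):

```python
import numpy as np

def photosynthesis_rate(I, T, pH, DO,
                        P_max=3.0, Ik=120.0, n=2.0,
                        T_opt=25.0, T_width=10.0,
                        pH_opt=8.0, pH_width=1.5,
                        DO_crit=11.0, m=4.0):
    """Hedged sketch of a multiplicative photosynthesis model: a hyperbolic
    light response scaled by temperature, pH and dissolved-oxygen factors.
    All parameter values are illustrative, not those fitted in the study."""
    f_I = I**n / (Ik**n + I**n)                      # light limitation/saturation
    f_T = np.exp(-((T - T_opt) / T_width) ** 2)      # Gaussian temperature response
    f_pH = np.exp(-((pH - pH_opt) / pH_width) ** 2)  # Gaussian pH response
    f_DO = 1.0 / (1.0 + (DO / DO_crit) ** m)         # inhibition above ~11 mg/l
    return P_max * f_I * f_T * f_pH * f_DO

print(photosynthesis_rate(I=500.0, T=30.0, pH=8.0, DO=8.0))
```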
Abstract:
Objectives: Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over 25 years old, those data are no longer representative of currently installed barriers or the present US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether current full-scale barrier crash testing criteria provide an indication of secondary collision risk for real-world barrier crashes. Methods: To characterize secondary collisions, 1,363 (596,331 weighted) real-world barrier midsection impacts selected from 13 years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS) were analyzed. Scene diagrams and available scene photographs were used to determine roadside and barrier-specific variables unavailable in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. To investigate current secondary collision crash test criteria, 24 full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using recommended evaluation criteria from National Cooperative Highway Research Program (NCHRP) Report 350. Results: Secondary collisions were found to occur in approximately two-thirds of crashes in which a barrier is the first object struck. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of 7 compared with cases with no second event. The NCHRP Report 350 exit angle criterion was found to underestimate the risk of secondary collisions in real-world barrier crashes. Conclusions: Consistent with previous research, collisions following a barrier impact are not infrequent and substantially increase driver injury risk. The results suggest that using exit-angle-based crash test criteria alone is not sufficient to predict second collision occurrence for real-world barrier crashes.
Abstract:
Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over twenty-five years old, the data used in that research are no longer representative of currently installed barriers or the US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether full-scale barrier crash testing criteria provide an indication of secondary collision risk for real-world barrier crashes. The analysis included 1,383 (596,331 weighted) real-world barrier midsection impacts selected from thirteen years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS). For each suitable case, the scene diagram and available scene photographs were used to determine roadside and barrier-specific variables not available in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of seven compared with cases with no second event. Twenty-four full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using recommended evaluation criteria from NCHRP Report 350. The NCHRP Report 350 exit angle criterion alone was found to be insufficient to predict second collision occurrence for real-world barrier crashes.
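The binary logistic regression modelling used in this and the preceding study can be sketched as follows; the predictor names and synthetic data are hypothetical stand-ins for the NASS/CDS-derived variables, and the survey weighting of the real analysis is omitted:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Hypothetical stand-ins for the study's predictors (names are ours).
stiff = rng.integers(0, 2, n)        # rigid (1) vs. flexible (0) barrier
redirected = rng.integers(0, 2, n)   # vehicle redirected back toward traffic
tracking = rng.integers(0, 2, n)     # pre-impact tracking condition
# Synthetic outcome generated from an assumed logistic model.
logit = -0.5 + 0.8 * stiff + 1.1 * redirected - 0.4 * tracking
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([stiff, redirected, tracking]))
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios for each predictor
```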
Abstract:
The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of traffic accident victims, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of external body findings and of injury-inflicting instruments. Correlating the injuries of the body with the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body findings, of the involved vehicles and of the inflicting tools, as well as analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning; for the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included processing the obtained data into 3D models, determining the driving direction of the vehicle, correlating the injuries with the vehicle damage, geometrically determining the impact situation and evaluating further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the vehicle damage in reconstructing the course of an accident, especially with regard to the impact situation, are demonstrated using two examined cases.
Abstract:
The purpose of this study was to evaluate the neuroimaging quality and accuracy of prospective real-time navigator-echo acquisition correction versus untriggered intrauterine magnetic resonance imaging (MRI) techniques. Twenty women in whom fetal motion artifacts compromised the neuroimaging quality of fetal MRI, performed at 28.7 ± 4 weeks of gestation, below diagnostic levels were additionally investigated using a navigator-triggered half-Fourier acquired single-shot turbo-spin echo (HASTE) sequence. Imaging quality was evaluated by two blinded readers applying a rating scale from 1 (not diagnostic) to 5 (excellent). Diagnostic criteria included depiction of the germinal matrix, grey and white matter, CSF, brain stem and cerebellum. Signal-difference-to-noise ratios (SDNRs) in the white matter and germinal zone were evaluated quantitatively. Imaging quality improved in 18/20 patients using the navigator-echo technique (2.4 ± 0.58 vs. 3.65 ± 0.73, p < 0.01 for all evaluation criteria). In 2/20 patients, fetal movement severely impaired image quality in both conventional and navigated HASTE. Navigator-echo imaging revealed additional structural brain abnormalities and confirmed the diagnosis in 8/20 patients, and diagnostic accuracy improved from 50% to 90%. The average SDNR increased from 0.7 ± 7.27 to 19.83 ± 15.71 (p < 0.01). Navigator-echo-based real-time triggering of fetal head movement is a reliable technique that can deliver diagnostic fetal MR image quality despite vigorous fetal movement.
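The signal-difference-to-noise ratio is commonly defined as the difference of mean signals between two regions divided by the noise standard deviation; a minimal sketch under that assumption (the study's exact definition may differ):

```python
import numpy as np

def sdnr(roi_a, roi_b, noise_roi):
    """Signal-difference-to-noise ratio between two regions of interest,
    e.g. white matter vs. germinal zone. This is a common definition and
    may differ in detail from the one used in the study."""
    return (np.mean(roi_a) - np.mean(roi_b)) / np.std(noise_roi)

# Toy example with synthetic pixel intensities.
rng = np.random.default_rng(0)
white_matter = rng.normal(220, 10, 100)
germinal_zone = rng.normal(180, 10, 100)
background = rng.normal(0, 5, 100)
print(sdnr(white_matter, germinal_zone, background))
```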
Abstract:
This dissertation presents competitive control methodologies for small-scale power systems (SSPS). An SSPS is a collection of sources and loads sharing a common network that can be isolated during terrestrial disturbances. Micro-grids, naval ship electric power systems (NSEPS), aircraft power systems and telecommunication power systems are typical examples of SSPS. Unlike large-scale power systems, an SSPS lacks a defined slack bus, and a change in a load or source influences the system parameters in real time; the control system must therefore provide the flexibility required to ensure operation as a single aggregated system. In most SSPS configurations, the sources and loads are equipped with power electronic interfaces, which can be modeled as dynamic controllable quantities. The mathematical formulation of the micro-grid is carried out with the help of game theory, optimal control and the fundamental theory of electrical power systems, so that the micro-grid can be viewed as a dynamic multi-objective optimization problem with nonlinear objectives and variables. Detailed analysis of optimal solutions was carried out for startup transient modeling, bus selection modeling and the level of communication within the micro-grid; in each approach, a detailed mathematical model was formed to observe the system response. A differential game theoretic approach was used for the modeling and optimization of startup transients, and the startup transient controller was implemented with open-loop, PI and feedback control methodologies. A hardware implementation was then carried out to validate the theoretical results. The proposed game theoretic controller shows higher performance than the traditional PI controller during startup; in addition, the optimal transient surface is necessary when implementing the feedback controller for the startup transient, and the experimental results are in agreement with the theoretical simulations. Bus selection and team communication were modeled with discrete and continuous game theory models. Although the players have multiple choices, the controller is capable of choosing the optimal bus, and the team communication structures are able to optimize the players' Nash equilibrium point. All mathematical models are based on local information of the load or source; as a result, these models are key to developing accurate distributed controllers.
Abstract:
In this thesis, we consider Bayesian inference on the detection of variance change-points in models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and thick-tailed and includes the Gaussian, Student-t, contaminated normal and slash distributions as special cases. The proposed models provide greater flexibility for analyzing practical data, which often exhibit heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters of the variance change-point models with SMN distributions. Owing to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior Bayesian inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed through simulation studies, and a real application to closing-price data from the U.S. stock market is analyzed for illustrative purposes.
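In the Gaussian special case of the SMN family, the posterior over a single variance change-point can even be obtained in closed form once each segment's variance is integrated out; the following sketch is our simplification for illustration, not the thesis's full Gibbs/Metropolis-Hastings sampler:

```python
import numpy as np
from scipy.special import gammaln

def changepoint_log_posterior(x):
    """Log-posterior over a single variance change-point k for zero-mean
    Gaussian data, integrating each segment's variance out under a
    Jeffreys prior p(sigma^2) ~ 1/sigma^2. A Gaussian-only simplification
    of the SMN setting treated in the thesis."""
    n = len(x)
    ks = np.arange(2, n - 1)
    logp = np.full(n, -np.inf)
    for k in ks:
        s1, s2 = np.sum(x[:k] ** 2), np.sum(x[k:] ** 2)
        # Inverse-gamma integral per segment: Gamma(n_i/2) * (s_i/2)^(-n_i/2)
        logp[k] = (gammaln(k / 2) - (k / 2) * np.log(s1 / 2)
                   + gammaln((n - k) / 2) - ((n - k) / 2) * np.log(s2 / 2))
    return logp - np.logaddexp.reduce(logp[ks])

# Synthetic series whose variance jumps at t = 150.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(0, 3, 100)])
print(np.argmax(changepoint_log_posterior(x)))  # should be near 150
```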
Abstract:
Semi-active damping devices have been shown to be effective in mitigating unwanted vibrations in civil structures. These devices impart force indirectly, through real-time alterations to structural properties. Simulating the complex behavior of these devices in laboratory-scale experiments is a major challenge: commercial devices for seismic applications typically operate in the 2-10 kN range, a force far too high for small-scale testing applications, where requirements typically range from 0-10 N. Several challenges must be overcome to produce damping forces at this level. In this study, a small-scale magneto-rheological (MR) damper utilizing a fluid-absorbent metal foam matrix is developed and tested to accomplish this goal. The matrix allows MR fluid to be extracted upon magnetic excitation, producing MR-fluid shear stresses and viscosity effects between an electromagnetic piston, the foam and the damper housing. Dampers for uniaxial seismic excitation are traditionally positioned in the horizontal orientation, allowing MR fluid to gather in the lower part of the damper housing when it is partially filled. The absorbent matrix is therefore placed in the bottom of the housing, eliminating the need to fill the entire device with MR fluid, a practice that requires seals that add significant unwanted friction to the desired low-force device. The damper, once constructed, can be used in feedback control applications to reduce seismic vibrations and to test structural control algorithms and wireless command devices. To validate the device, a parametric study was performed using force and acceleration measurements to characterize damper performance and controllability. A discussion of the results is presented to demonstrate the attainment of the damper design objectives.
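A common first-order description of MR damper behavior (an assumption for illustration; the abstract does not specify the model identified in the parametric study) is the Bingham plastic model, in which a current-dependent yield force adds to viscous damping:

```python
import numpy as np

def bingham_damper_force(velocity, current, c0=2.0, alpha=3.5, f0=0.1):
    """Hedged Bingham-model sketch of an MR damper sized for small-scale
    testing (force in N). The yield force grows with coil current; all
    parameter values are illustrative, not identified from the device."""
    f_yield = alpha * current                 # field-dependent yield force
    return f_yield * np.sign(velocity) + c0 * velocity + f0

for v in (-0.05, 0.0, 0.05):                  # piston velocities in m/s
    print(v, bingham_damper_force(v, current=1.0))
```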
Abstract:
Even though complete resection is regarded as the only curative treatment for non-small cell lung cancer (NSCLC), >50% of resected patients die from a recurrence or a second primary tumour of the lung within 5 yrs. It remains unclear whether follow-up in these patients is cost-effective and whether it can improve the outcome through early detection of recurrent tumour. The benefit of regular follow-up was analysed in a consecutive series of 563 patients who had undergone potentially curative resection for NSCLC at the University Hospital. The follow-up consisted of clinical visits and chest radiography according to a standard protocol for up to 10 yrs. Survival rates were estimated using the Kaplan-Meier method, and the cost-effectiveness of the follow-up programme was assessed. A total of 23 patients (6.4% of the lobectomy group) underwent further operation with curative intent for a second pulmonary malignancy. The regular follow-up over a 10-yr period provided the chance of a second curative treatment to 3.8% of all patients. The calculated cost per life-yr gained was 90,000 Swiss Francs, far above that of comparable large-scale surveillance programmes. Based on these data, the intensity and duration of the follow-up were reduced.
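The survival estimation and the cost-per-life-year calculation can be sketched as follows; the data and cost figures below are hypothetical, and lifelines is simply one available implementation of the Kaplan-Meier estimator:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up data: years from resection and death indicator.
df = pd.DataFrame({
    "years": [0.8, 2.1, 3.4, 5.0, 6.7, 10.0, 10.0, 4.2],
    "died":  [1,   1,   1,   0,   1,   0,    0,    1],
})
kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["died"])
print(kmf.median_survival_time_)

# Cost-effectiveness in its simplest form: total programme cost per life-yr gained.
total_cost_chf = 1_000_000    # hypothetical total follow-up cost
life_years_gained = 11.1      # hypothetical, attributable to early re-treatment
print(total_cost_chf / life_years_gained, "CHF per life-yr gained")
```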
Abstract:
We present redirection techniques that support exploration of large-scale virtual environments (VEs) by means of real walking. We quantify the degree to which users can unknowingly be redirected in order to guide them through VEs in which the virtual paths differ from the physical paths. We further introduce the concept of dynamic passive haptics, by which any number of virtual objects can be mapped to real physical proxy props having similar haptic properties (i.e., size, shape, and surface structure), such that the user can sense these virtual objects by touching their real-world counterparts. Dynamic passive haptics provides the user with the illusion of interacting with a desired virtual object by redirecting her to the corresponding proxy prop. We describe the concepts of generic redirected walking and dynamic passive haptics and present experiments in which we have evaluated these concepts. Furthermore, we discuss implications derived from a user study, and we present approaches that derive physical paths which may vary from their virtual counterparts.
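Redirection is typically implemented by applying gains to the user's tracked motion, for example scaling real head rotations before they drive the virtual camera; a minimal sketch (the gain value is illustrative, not a detection threshold reported here):

```python
def redirect_rotation(real_yaw_delta_deg, rotation_gain=1.1):
    """Scale the user's real head rotation before applying it to the
    virtual camera; gains slightly above or below 1 steer the physical
    path away from the virtual one without the user noticing."""
    return rotation_gain * real_yaw_delta_deg

virtual_yaw = 0.0
for real_delta in (2.0, 2.0, -1.0):   # tracked yaw increments in degrees
    virtual_yaw += redirect_rotation(real_delta)
print(virtual_yaw)                    # 1.1 * 3.0 = 3.3 degrees
```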
Abstract:
Semi-natural grasslands are widely recognized for their high ecological value. They often count among the most species-rich habitats, especially in traditional cultural landscapes. Maintaining and/or restoring them is a top priority, but nevertheless represents a real conservation challenge, especially regarding their invertebrate assemblages. The main goal of this study was to experimentally investigate the influence of four different mowing regimes on orthopteran communities and populations: (1) control meadow (C-meadow): mowing regime according to the Swiss regulations for extensively managed meadows declared as ecological compensation areas, i.e. first cut not before 15 June; (2) first cut not before 15 July (delayed treatment, D-meadow); (3) first cut not before 15 June and second cut not earlier than 8 weeks after the first cut (8W-meadow); (4) refuges left uncut on 10–20% of the meadow area (R-meadow). Data were collected two years after the introduction of these mowing treatments. Orthopteran densities from spring to early summer were five times higher in D-meadows than in C-meadows. In R-meadows, densities were, on average, twice as high as in C-meadows, while mean species richness was 23% higher in R-meadows than in C-meadows. Provided that farmers are given appropriate financial incentives, the D- and R-meadow regimes would be relatively easy to implement within agri-environment schemes. Such meadows could deliver substantial benefits for functional biodiversity, including sustenance for the many secondary consumers that depend on field invertebrates as a staple food.
Abstract:
BACKGROUND CONTEXT The Swiss Federal Office of Public Health mandated a nationwide health technology assessment registry for balloon kyphoplasty (BKP) to support decision making on the reimbursement of these interventions. The early results of the registry led to permanent coverage of BKP by basic health insurance, and documentation was continued for further evidence generation. PURPOSE This analysis reports the 1-year results of patients after BKP treatment. STUDY DESIGN Prospective multicenter observational case series. PATIENT SAMPLE Data on 625 cases with 819 treated vertebrae were documented from March 2005 to May 2012. OUTCOME MEASURES Surgeon-administered outcome instruments were the primary intervention form for BKP and the follow-up form; patient self-reported measures were the EuroQol-5D questionnaire, the North American Spine Society outcome instrument/Core Outcome Measures Index (including a visual analog scale), and a comorbidity questionnaire. Outcome measures were back pain, medication, quality of life (QoL), cement extrusions, and new fractures within the first postoperative year. METHODS Data were recorded preoperatively and at 3- to 6-month and 1-year follow-ups. The Wilcoxon signed-rank test was used to compare pre- and postoperative measurements. Multivariate logistic regression was used to identify factors with a significant influence on the outcome. RESULTS Seventy percent of patients were women, with a mean age of 71 years (range, 18-91 years); the mean age of men was 65 years (range, 15-93 years). Significant and clinically relevant reduction of back pain, improvement of QoL, and reduction of painkiller consumption were seen within the first postoperative year. Preoperative back pain decreased from 69.3 to 29.0 at the 3- to 6-month follow-up and remained unchanged at 1 year; accordingly, QoL improved from 0.23 to 0.71 and 0.75 at the same follow-up intervals. The overall vertebra-based cement extrusion rates with and without extrusions into intervertebral discs were 22.1% and 15.3%, respectively. There were five symptomatic cement extrusions with radiculopathy (0.8%). A new vertebral fracture within a year of the BKP surgery was observed in 18.4% of the patients. CONCLUSIONS The results of the largest observational study of BKP so far are consistent with published randomized trials and systematic reviews. In this routine health care setting, BKP is safe and effective in reducing pain, improving QoL, and lowering painkiller consumption, with an acceptable rate of cement extrusions. Postoperative outcomes show clear and significant clinical improvement at early follow-up that remains stable during the first postoperative year.
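The pre- versus postoperative comparison relies on the Wilcoxon signed-rank test for paired measurements; a minimal sketch with hypothetical visual analog scale back-pain scores (the registry data are not reproduced here):

```python
from scipy.stats import wilcoxon

# Hypothetical paired VAS back-pain scores (0-100), pre-op vs. 1-year follow-up.
pre = [69, 75, 62, 81, 58, 70, 66, 73]
post = [29, 35, 20, 40, 31, 28, 33, 25]
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")
```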
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.