867 results for Measurement-based quantum computing


Relevance:

30.00%

Publisher:

Abstract:

The goal of this research project was to develop a method to measure the performance of a winter maintenance program with respect to the task of providing safety and mobility to the travelling public. Developing these measures required a number of steps, each of which was accomplished. First, the impact of winter weather on safety (crash rates) and mobility (average vehicle speeds) was measured by a combination of literature reviews and analysis of Iowa Department of Transportation traffic and Road Weather Information System data. Second, because not all winter storms are the same in their effects on safety and mobility, a method had to be developed to determine how much the various factors that describe a winter storm actually change safety and mobility. As part of this effort, a storm severity index was developed, which ranks each winter storm on a scale between 0 (a very benign storm) and 1 (the worst imaginable storm). Additionally, a number of methods of modeling the relationships between weather, winter maintenance actions and road surface conditions were developed and tested. The end result of this study was a performance measure based on average vehicle speed. For a given class of road, a maximum expected average speed reduction has been identified. For a given storm, this maximum expected average speed reduction is multiplied by the storm severity index to give a target average speed reduction. Thus, if for a given road the maximum expected average speed reduction is 20 mph, and the storm severity for a particular storm is 0.6, then the target average speed reduction for that road in that storm is 0.6 × 20 mph, or 12 mph. If the average speed on that road during and after the storm is no more than 12 mph below the average speed on that road in good weather conditions, then the winter maintenance performance goal has been met.
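The target-reduction rule described in the abstract is simple enough to state as code. This is an illustrative sketch of the arithmetic only (the function names are ours, not the authors'):

```python
def target_speed_reduction(max_expected_reduction_mph: float,
                           storm_severity: float) -> float:
    """Scale a road class's maximum expected average speed reduction by
    the storm severity index (0 = very benign, 1 = worst imaginable)."""
    if not 0.0 <= storm_severity <= 1.0:
        raise ValueError("storm severity index must lie in [0, 1]")
    return storm_severity * max_expected_reduction_mph

def goal_met(observed_reduction_mph: float,
             max_expected_reduction_mph: float,
             storm_severity: float) -> bool:
    """The maintenance goal is met when the observed average speed
    reduction does not exceed the severity-adjusted target."""
    return observed_reduction_mph <= target_speed_reduction(
        max_expected_reduction_mph, storm_severity)

# Worked example from the abstract: 0.6 * 20 mph = 12 mph target.
print(target_speed_reduction(20, 0.6))  # 12.0
print(goal_met(11, 20, 0.6))            # True
```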

Relevance:

30.00%

Publisher:

Abstract:

Activity monitors based on accelerometry are used to predict the speed and energy cost of walking at 0% slope, but not at other inclinations. Parallel measurements of body acceleration and altitude variation were studied to determine whether walking speed prediction could be improved. Fourteen subjects walked twice along a 1.3 km circuit with substantial slope variations (-17% to +17%). The parameters recorded were body acceleration, using a uni-axial accelerometer; altitude variation, using differential barometry; and walking speed, using differential satellite positioning (DGPS). Linear regressions were calculated between acceleration and walking speed, and between acceleration/altitude and walking speed. These predictive models, calculated using the data from the first circuit run, were used to predict speed during the second circuit. Finally, the predicted velocity was compared with the measured one. Acceleration alone failed to predict speed (mean r = 0.4); adding altitude variation improved the prediction (mean r = 0.7). Substantial inter-individual variation was found in the altitude/acceleration-speed relationship. It is concluded that accelerometry, combined with altitude measurement, can assess position variations of humans provided that inter-individual variation is taken into account. It is also confirmed that DGPS can be used for outdoor walking speed measurements, opening up new perspectives in the field of biomechanics.
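The study's two-step procedure (fit a linear model on the first circuit, validate the prediction on the second) can be sketched as follows. The numbers below are invented placeholders, not the study's data:

```python
import numpy as np

# Hypothetical per-segment data: body acceleration (arbitrary units),
# altitude variation (m), and DGPS walking speed (m/s) on circuit 1.
acc1 = np.array([0.8, 1.1, 0.9, 1.3, 1.0, 1.2])
alt1 = np.array([-2.0, 1.5, -0.5, 3.0, 0.0, 2.0])
spd1 = np.array([1.6, 1.1, 1.4, 0.9, 1.3, 1.0])

# Fit speed ~ b0 + b1*acceleration + b2*altitude by ordinary least squares.
X1 = np.column_stack([np.ones_like(acc1), acc1, alt1])
beta, *_ = np.linalg.lstsq(X1, spd1, rcond=None)

# Predict circuit 2 and correlate prediction with measurement,
# mirroring the validation step described in the abstract.
acc2 = np.array([0.9, 1.2, 1.0, 1.25])
alt2 = np.array([-1.0, 2.0, 0.5, 2.5])
spd2 = np.array([1.5, 1.0, 1.3, 1.0])
pred2 = np.column_stack([np.ones_like(acc2), acc2, alt2]) @ beta
r = np.corrcoef(pred2, spd2)[0, 1]
print("correlation on circuit 2:", round(r, 2))
```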

Relevance:

30.00%

Publisher:

Abstract:

The use of quantum dots (QDs) in the area of fingermark detection is currently receiving a lot of attention in the forensic literature. Most research efforts have been devoted to cadmium telluride (CdTe) quantum dots, often applied as powders to the surfaces of interest. Both the use of cadmium and the nano size of these particles raise important health and safety issues. This paper proposes to replace CdTe QDs with copper-doped zinc sulphide QDs (ZnS:Cu) to address these issues. ZnS:Cu QDs were successfully synthesized, characterized in terms of size and optical properties, and optimized for the detection of impressions left in blood, where CdTe QDs had proved efficient. Effectiveness of detection was assessed in comparison with CdTe QDs and Acid Yellow 7 (AY7, an effective blood reagent), using two series of depletive blood fingermarks from four donors prepared on four non-porous substrates: glass, transparent polypropylene, black polyethylene and aluminium foil. The marks were cut in half and processed separately with both reagents, leading to two comparison series (ZnS:Cu vs. CdTe, and ZnS:Cu vs. AY7). ZnS:Cu proved better than AY7 and at least as efficient as CdTe on most substrates. Consequently, copper-doped ZnS QDs constitute a valid substitute for cadmium-based QDs for detecting blood marks on non-porous substrates and offer a safer alternative for routine use.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: In this study, we investigated the structural plasticity of the contralesional motor network in ischemic stroke patients using diffusion magnetic resonance imaging (MRI) and explored a model that combines an MRI-based metric of contralesional network integrity with clinical data to predict functional outcome at 6 months after stroke. METHODS: MRI and clinical examinations were performed in 12 patients in the acute phase and at 1 and 6 months after stroke. Twelve age- and gender-matched controls underwent 2 MRIs 1 month apart. Structural remodeling after stroke was assessed using diffusion MRI with an automated measurement of generalized fractional anisotropy (GFA), which was calculated along connections between contralesional cortical motor areas. The predictive model of poststroke functional outcome was computed using a linear regression of acute GFA measures and the clinical assessment. RESULTS: GFA changes in the contralesional motor tracts were found in all patients and differed significantly from controls (0.001 ≤ p < 0.05). GFA changes in intrahemispheric and interhemispheric motor tracts correlated with age (p ≤ 0.01); those in intrahemispheric motor tracts correlated strongly with clinical scores and stroke sizes (p ≤ 0.001). GFA measured in the acute phase, together with a routine motor score and age, was a strong predictor of motor outcome at 6 months (r(2) = 0.96, p = 0.0002). CONCLUSION: These findings represent a proof of principle that contralesional diffusion MRI measures may provide reliable information for personalized rehabilitation planning after ischemic motor stroke. Neurology® 2012;79:39-46.

Relevance:

30.00%

Publisher:

Abstract:

Magnetization versus temperature in the temperature interval 2-200 K was measured for amorphous alloys of three different compositions: Fe81.5B14.5Si4, Fe40Ni38Mo4B18, and Co70Fe5Ni2Mo3B5Si15. The measurements were performed by means of a SQUID (superconducting quantum interference device) magnetometer. The aim was to extract information about the different mechanisms contributing to thermal demagnetization. A powerful data analysis technique based on successive minimization procedures has demonstrated that Stoner excitations of the strong ferromagnetic type play a significant role in the Fe-Ni alloy studied. The Fe-rich and Co-rich alloys do not show a measurable contribution from single-particle excitations.

Relevance:

30.00%

Publisher:

Abstract:

We present the application of a real-time quantitative PCR assay, previously developed to measure relative telomere length in humans and mice, to two bird species, the zebra finch Taeniopygia guttata and the Alpine swift Apus melba. This technique is based on the PCR amplification of telomeric (TTAGGG)n sequences using specific oligonucleotide primers. Relative telomere length is expressed as the ratio (T/S) of telomere repeat copy number (T) to control single gene copy number (S). This method is particularly useful for comparisons of individuals within species, or where the same individuals are followed longitudinally. We used glyceraldehyde-3-phosphate dehydrogenase (GAPDH) as the single control gene. In both species, we validated our PCR measurements of relative telomere length against absolute measurements of telomere length determined by the conventional method of quantifying telomere terminal restriction fragment (TRF) lengths, using both traditional Southern blot analysis (Alpine swifts) and in-gel hybridization (zebra finches). As found in humans and mice, telomere lengths in the same sample measured by TRF and PCR were well correlated in both the Alpine swift and the zebra finch. Hence, this PCR assay for the measurement of bird telomeres, which is fast and requires only small amounts of genomic DNA, should open new avenues in the study of environmental factors influencing variation in telomere length, and of how this variation translates into variation in cellular and whole-organism senescence.
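The T/S ratio described above is conventionally computed from qPCR threshold-cycle (Ct) values. Assuming, as is standard for relative quantification, amplification efficiencies close to 100% for both reactions, the ratio reduces to a power of two of the Ct difference. This is an illustrative sketch of that standard calculation, not necessarily the authors' exact pipeline:

```python
def ts_ratio(ct_telomere: float, ct_single_gene: float) -> float:
    """Relative telomere length T/S = 2**-(Ct_T - Ct_S), assuming
    ~100% amplification efficiency for the telomere and the
    single-copy control gene (e.g. GAPDH) reactions."""
    return 2.0 ** -(ct_telomere - ct_single_gene)

# A sample whose telomere reaction crosses threshold one cycle earlier
# than its control reaction carries twice the relative telomere signal.
print(ts_ratio(14.0, 15.0))  # 2.0
print(ts_ratio(15.0, 15.0))  # 1.0
```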

Relevance:

30.00%

Publisher:

Abstract:

A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q = 1/2 case. We show that, when the residual principle is considered as a constraint, the q = 1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution devised in this way is endowed with a component which corresponds to the well-known regularized solution of Tikhonov (1977).
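For context on the Tikhonov component mentioned above, a minimal numpy sketch of classical Tikhonov regularization (the standard method the abstract compares against, not the non-extensive method itself):

```python
import numpy as np

def tikhonov(A: np.ndarray, b: np.ndarray, lam: float) -> np.ndarray:
    """Classical Tikhonov (1977) regularized least squares:
    x = argmin ||Ax - b||^2 + lam * ||x||^2,
    solved via the normal equations (A^T A + lam*I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Ill-conditioned toy system: increasing lam damps (shrinks) the
# solution, trading data fidelity for stability.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])
print(tikhonov(A, b, 1e-12))  # close to the exact solution [1, 1]
print(tikhonov(A, b, 0.1))    # smaller-norm, stabilized solution
```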

Relevance:

30.00%

Publisher:

Abstract:

In the present work, investigations of the structural, magnetic and electronic properties of GaAs/Ga1-xInxAs/GaAs quantum wells (QW) are reported, the wells having a 0.5-1.8 monolayer thick Mn layer separated from the quantum well by a 3 nm thick spacer. The structure of the samples is analyzed in detail by photoluminescence and high-resolution X-ray diffractometry and reflectometry, confirming that Mn atoms are practically absent from the QW. Transport properties and crystal structure are analyzed for the first time for this type of QW structure with such high mobility. The observed conductivity and Hall effect in quantizing magnetic fields over a wide temperature range, governed by the transport of holes in the quantum well, demonstrate properties inherent to ferromagnetic systems with spin polarization of the charge carriers in the QW. Investigation of the Shubnikov-de Haas and Hall effects made it possible to estimate energy band parameters such as the cyclotron mass and Fermi level, to calculate the concentrations and mobilities of holes, and to show the high quality of the structures. Magnetic ordering is confirmed by the existence of the anomalous Hall effect.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents research concerning the conversion of non-accessible web pages containing mathematical formulae into accessible versions through an OCR (Optical Character Recognition) tool. The objective of this research is twofold. First, to establish criteria for evaluating the potential accessibility of mathematical web sites, i.e. the feasibility of converting non-accessible (non-MathML) math sites into accessible (MathML) ones. Second, to propose a data model and a mechanism to publish evaluation results, making them available to the educational community, who may use them as a quality measurement for selecting learning material. Results show that conversion using OCR tools is not viable for math web pages, mainly for two reasons: many of these pages are designed to be interactive, making a correct conversion difficult, if not almost impossible; and formulae (either images or text) have been written without taking into account standards of math writing, so OCR tools do not properly recognize math symbols and expressions. In spite of these results, we think the proposed methodology to create and publish evaluation reports may be rather useful in other accessibility assessment scenarios.

Relevance:

30.00%

Publisher:

Abstract:

This research report presents an application of systems theory to evaluating intellectual capital (IC) as an organization's ability for self-renewal. As renewal ability is a dynamic capability of the organization as a whole, rather than a static asset or an atomistic competence of separate individuals within the organization, it needs to be understood systemically. Consequently, renewal ability has to be measured with systemic methods that are based on a thorough conceptual analysis of the systemic characteristics of organizations. The aim of this report is to demonstrate the theory and analysis methodology for grasping companies' systemic efficiency and renewal ability. The volume is divided into three parts. The first deals with the theory of organizations as self-renewing systems. In the second part, the principles of quantitative analysis of organizations are laid down. Finally, the detailed mathematics of the renewal indices is presented. We also assert that the indices produced by the analysis are an effective tool for the management and valuation of knowledge-intensive companies.

Relevance:

30.00%

Publisher:

Abstract:

A power electronics device is a control and regulation system that converts electricity from its available form into a desired new form while governing the flow of electrical power from the source to the load. This differs from signal electronics, in which electricity is typically used to transmit information by means of different states. Power electronics devices are usually compared in terms of reliability, size, efficiency, control accuracy and, of course, price. Typical power electronics devices include frequency converters, UPS (Uninterruptible Power Supply) devices, welding machines, induction heaters and various power supplies. Traditionally, the control of these devices has been implemented using microprocessors, ASICs (Application Specific Integrated Circuits) or ICs (Integrated Circuits), together with analog controllers. This study analyzes the suitability of FPGAs (Field Programmable Gate Arrays) for the control of power electronics. An FPGA consists of various logic elements and the interconnections between them. The logic elements are gate circuits and flip-flops. The interconnections and logic elements are fixed in the device, and their composition and number cannot be changed afterwards; programmability arises from the connections between the elements. The device contains numerous switches, up to millions, whose state can be set, so that a vast number of different functional configurations can be formed from the basic elements. FPGAs have long been used in communications products, and their development has therefore been rapid in recent years while prices have dropped. As a result, FPGAs have become an attractive alternative for the control of power electronics devices as well. In this doctoral work, the suitability of FPGAs was studied using two demanding and dissimilar practical power electronics devices: a frequency converter and a welding machine.
Suitable prototypes for both test cases were built together with Finnish industrial companies in the field, and their control electronics were converted to be FPGA-based. In addition, new types of control methods exploiting this new technology were developed. The performance of the prototypes was compared with that of corresponding commercial products controlled by traditional methods, and the benefits brought by the parallel computation enabled by FPGAs were observed in the performance of both power electronics devices. The work also presents new methods and tools for the development and testing of FPGA-based control systems; with these methods, product development can be made as fast and efficient as possible. In addition, an FPGA-internal control and communication bus structure was developed to serve the control applications of power electronics devices. The new communication structure also promotes the reusability of already implemented subsystems in future applications and product generations.

Relevance:

30.00%

Publisher:

Abstract:

Due to intense international competition, demanding and sophisticated customers, and diverse transformative technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex, but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for motivating and benchmarking. Earlier research in the field of R&D performance analysis has generally focused either on the activities and relevant factors and dimensions - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - prior to the selection of R&D performance measures, or on proposed principles or the actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures, which have been applied in real-world organizations. The earlier models for corporate performance measurement found in the literature are to some extent adaptable to the development of measurement systems and the selection of measures in R&D activities.
However, it is necessary to emphasize the special aspects related to the measurement of R&D performance in a way that makes the development of new approaches, especially for R&D performance measure selection, necessary. First, the special characteristics of R&D - such as the long time lag between inputs and outcomes, as well as the overall complexity and difficult coordination of activities - influence the R&D performance analysis problems, such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision-making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems, i.e. the research has normative features, providing guidelines through novel types of approaches.
The gathering of data and the conduct of case studies in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped us to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations. This is essential, as recognition of the most important problem areas is a crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various approaches constructed and presented in this dissertation: 1) the selection of R&D measures became more systematic when compared to the empirical analysis, as it was common that no systematic approaches had previously been utilized in the studied organizations; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be more directly utilized in decision-making, because of the thorough consideration of the purpose of measurement, as well as other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge of R&D performance analysis by facilitating the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the novel types of approaches, methods and tools developed for the selection processes of R&D measures, applied in real-world organizations.
In the whole research, facilitation of dealing with the versatility and challenges in R&D performance analysis, as well as the factors and dimensions influencing the R&D performance measure selection are strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from the scientific as well as from the practical point of view.

Relevance:

30.00%

Publisher:

Abstract:

A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper shows a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curricula to cope with the above problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.