870 results for Time Use


Relevance:

30.00%

Publisher:

Abstract:

This thesis uses an automatic pattern-recognition algorithm and common dual moving average crossover rules to explain the sell-buy imbalance of retail investors trading on the Stuttgart stock exchange, and thereby to answer the question: do retail investors base their trading decisions on technical analysis methods? Based on prior research on investor behaviour and on the profitability of technical analysis, the working assumption was that retail investors would use technical analysis methods. The empirical study, based on data on the DAX30 companies from 2009 to 2013, did not yield a sufficiently clear answer to the research question. Weak evidence nevertheless suggests that retail investors shift their trading behaviour in the direction indicated by certain patterns and crossover rules.
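A dual moving average crossover rule of the kind referred to above can be sketched as follows; the window lengths and the signal convention here are illustrative assumptions, not the thesis's exact parameters:

```python
def sma(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return [
        sum(prices[i - window + 1:i + 1]) / window
        for i in range(window - 1, len(prices))
    ]

def crossover_signals(prices, short=3, long=5):
    """Emit 'buy' when the short SMA crosses above the long SMA,
    'sell' when it crosses below, else None (hypothetical rule)."""
    s, l = sma(prices, short), sma(prices, long)
    s = s[long - short:]  # align both series on the same dates
    signals = [None]
    for i in range(1, len(l)):
        if s[i - 1] <= l[i - 1] and s[i] > l[i]:
            signals.append('buy')
        elif s[i - 1] >= l[i - 1] and s[i] < l[i]:
            signals.append('sell')
        else:
            signals.append(None)
    return signals
```

A study of this kind would then compare the dates of such signals against the observed retail sell-buy imbalance.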


Neuron-specific enolase (NSE) is a glycolytic enzyme present almost exclusively in neurons and neuroendocrine cells. NSE levels in cerebrospinal fluid (CSF) are assumed to be useful to estimate neuronal injury and clinical outcome of patients with serious clinical manifestations such as those observed in stroke, head injury, anoxic encephalopathy, encephalitis, brain metastasis, and status epilepticus. We compared levels of NSE in serum (sNSE) and in CSF (cNSE) among four groups: patients with meningitis (N = 11), patients with encephalic injuries associated with impairment of consciousness (ENC, N = 7), patients with neurocysticercosis (N = 25), and normal subjects (N = 8). Albumin was determined in serum and CSF samples, and the albumin quotient was used to estimate blood-brain barrier permeability. The Glasgow Coma Scale score was calculated at the time of lumbar puncture and the Glasgow Outcome Scale (GOS) score was calculated at the time of patient discharge or death. The ENC group had significantly higher cNSE (P = 0.01) and albumin quotient (P = 0.005), but not sNSE (P = 0.14), levels than the other groups (Kruskal-Wallis test). Patients with lower GOS scores had higher cNSE levels (P = 0.035) than patients with favorable outcomes. Our findings indicate that sNSE is not sensitive enough to detect neuronal damage, but cNSE seems to be reliable for assessing patients with considerable neurological insult and cases with adverse outcome. However, one should be cautious about estimating the severity of neurological status as well as outcome based exclusively on cNSE in a single patient.


Virtual environments and real-time simulators (VERS) are becoming increasingly important tools in the research and development (R&D) process of non-road mobile machinery (NRMM). Virtual prototyping techniques enable faster and more cost-efficient development of machines compared with the use of real-life prototypes. High energy efficiency has become an important topic in the world of NRMM because of environmental and economic demands. The objective of this thesis is to develop VERS-based methods for the research and development of NRMM. A process using VERS for assessing the effects of human operators on the life-cycle efficiency of NRMM was developed. Human-in-the-loop simulations were run with an underground mining loader to study the developed process. The simulations were run in the virtual environment of the Laboratory of Intelligent Machines of Lappeenranta University of Technology. A physically adequate real-time simulation model of NRMM was shown to be reliable and cost-effective for testing hardware components by means of hardware-in-the-loop (HIL) simulations. A control interface connecting an integrated electro-hydraulic energy converter (IEHEC) with a virtual simulation model of a log crane was developed. The IEHEC consists of a hydraulic pump-motor and an integrated electrical permanent magnet synchronous motor-generator. The results show that state-of-the-art real-time NRMM simulators are capable of resolving factors related to the energy consumption and productivity of NRMM. A significant variation between the test drivers was found. The results show that VERS can be used for assessing human effects on the life-cycle efficiency of NRMM. HIL simulation responses, compared with those achieved with the conventional simulation method, demonstrate the advantages and drawbacks of the various possible interfaces between the simulator and the hardware part of the system under study. Novel ideas for arranging the interface were successfully tested and compared with the more traditional one.
The proposed process for assessing the effects of operators on life-cycle efficiency will be applied to a wider group of operators in the future. The driving styles of the operators can then be analysed statistically from a sufficiently large result dataset. Such statistical analysis can identify the most life-cycle-efficient driving style for a specific environment and machinery. The proposed control interface for HIL simulation needs further study: the robustness and adaptation of the interface in different situations must be verified. Future work will also include studying the suitability of the IEHEC for different working machines using the proposed HIL simulation method.


When the offset of a visual stimulus (GAP condition) precedes the onset of a target, saccadic reaction times are reduced in relation to the condition with no offset (overlap condition) - the GAP effect. However, the existence of the GAP effect for manual responses is still controversial. In two experiments using both simple (Experiment 1, N = 18) and choice key-press procedures (Experiment 2, N = 12), we looked for the GAP effect in manual responses and investigated possible contextual influences on it. Participants were asked to respond to the imperative stimulus that would occur under different experimental contexts, created by varying the array of warning-stimulus intervals (0, 300 and 1000 ms) and conditions (GAP and overlap): i) intervals and conditions were randomized throughout the experiment; ii) conditions were run in different blocks and intervals were randomized; iii) intervals were run in different blocks and conditions were randomized. Our data showed that no GAP effect was obtained for any manipulation. The predictability of stimulus occurrence produced the strongest influence on response latencies. In Experiment 1, simple manual responses were shorter when the intervals were blocked (247 ms, P < 0.001) in relation to the other two contexts (274 and 279 ms). Despite the use of choice key-press procedures, Experiment 2 produced a similar pattern of results. A discussion addressing the critical conditions to obtain the GAP effect for distinct motor responses is presented. In short, our data stress the relevance of the temporal allocation of attention for behavioral performance.


Several methods have been described to measure intraocular pressure (IOP) in clinical and research situations. However, the measurement of time-varying IOP with high accuracy, mainly in situations that alter corneal properties, has not been reported until now. The present report describes a computerized system capable of recording the transitory variability of IOP, sufficiently sensitive to reliably measure ocular pulse peak-to-peak values. We also describe its characteristics and discuss its applicability to research and clinical studies. The device consists of a pressure transducer, a signal conditioning unit and an analog-to-digital converter coupled to a video acquisition board. A modified Cairns trabeculectomy was performed in 9 Oryctolagus cuniculus rabbits to obtain changes in IOP decay parameters and to evaluate the utility and sensitivity of the recording system. The device was effective for the study of kinetic parameters of IOP, such as the decay pattern and the ocular pulse waves due to cardiac and respiratory rhythms. In addition, there was a significant increase in the derivative of the IOP-versus-time curve when pre- and post-trabeculectomy recordings were compared. The present procedure excludes corneal thickness and error related to individual operator ability. Clinical complications due to saline infusion and pressure overload were not observed during biomicroscopic evaluation. Among the disadvantages of the procedure are the requirement of anesthesia and its suitability for acute recordings rather than chronic protocols. Finally, the method described may provide a reliable alternative for the study of dynamic alterations of ocular pressure in man and may facilitate the investigation of the pathogenesis of glaucoma.
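In its simplest form, the ocular pulse peak-to-peak measurement mentioned above reduces to the max-min amplitude of each cardiac cycle in the sampled pressure trace. A minimal sketch, assuming a fixed number of samples per cycle (the real system segments cycles from the waveform itself):

```python
def pulse_peak_to_peak(samples, cycle_len):
    """Peak-to-peak amplitude of each full cycle in a sampled
    pressure trace (hypothetical fixed-length cycles, pressures
    in mmHg)."""
    amplitudes = []
    for start in range(0, len(samples) - cycle_len + 1, cycle_len):
        cycle = samples[start:start + cycle_len]
        amplitudes.append(max(cycle) - min(cycle))
    return amplitudes
```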


The aim of this thesis was to create a process for all multi-site ramp-up (MSRU) projects in the case company in order to enable simultaneous ramp-ups early in the market. The research was carried out as a case study in one company, using semi-structured interviews. Processes already exist and are in use in MSRU cases. Interviews with 20 ramp-up specialists revealed topics to be improved: project team set-up, roles and responsibilities, recommended project organization, communication, product change management practices, competence and know-how transfer practices, and the support model. More R&D support and involvement is needed in MSRU projects. The DCM's role in MSRU projects is very important within the PMT team; he or she should be the business owner of the project. It is recommended that product programs take care of the product and repair training for new products in volume factories. R&D's participation in competence transfers is essential in MSRU projects. Project communication could be shared through a dedicated intranet community, and blogging and tweeting could be considered in the communication plan. If hundreds of change notes are still open in the ramp-up phase, the product should not be approved for volume ramp-up. PMT support is also important, and MSRU projects should be planned, budgeted and executed together. Finally, a new MSRU process is presented in this thesis, to be used in all MSRU projects.


Hereditary hemochromatosis (HH) is a common autosomal disorder of iron metabolism mainly affecting Caucasian populations. Three recurrent disease-associated mutations have been detected in the hemochromatosis gene (HFE): C282Y, H63D, and S65C. Although the HH phenotype has been associated with all three mutations, C282Y is considered the most relevant mutation responsible for hemochromatosis. Clinical complications of HH include cirrhosis of the liver, congestive cardiac failure, cardiac arrhythmias, and endocrine pancreatic disease, all of which can be prevented by early diagnosis and treatment. Therefore, a reliable genotyping method is required for presymptomatic diagnosis. We describe the simultaneous detection of the C282Y, H63D and S65C mutations in the hemochromatosis gene by real-time PCR followed by melting curve analysis using fluorescence resonance energy transfer (FRET) probes. The acceptor fluorophore may be replaced by a quencher, increasing multiplexing possibilities. Real-time PCR results were compared to the results of sequencing and of conventional PCR followed by restriction digestion and detection by agarose gel electrophoresis (PCR-RFLP). Genotypes from 80 individuals obtained both by the conventional PCR-RFLP method and by quenched-FRET real-time PCR were in full agreement. Sequencing also confirmed the results obtained by the new method, which proved to be an accurate, rapid and cost-effective diagnostic assay. Our findings demonstrate the usefulness of real-time PCR for the simultaneous detection of mutations in the HFE gene, which significantly reduces sample processing time compared to the PCR-RFLP method, eliminates the use of toxic reagents, reduces the risk of contamination in the laboratory, and enables full process automation.
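The genotype call behind melting curve analysis can be sketched as a simple classification of a sample's melting-peak temperatures; the peak temperatures, tolerance, and decision rule below are placeholder assumptions, not the assay's validated values:

```python
def call_genotype(tm_peaks, wildtype_tm, mutant_tm, tol=1.0):
    """Classify a sample from its melting-peak temperatures (degrees C):
    only the wild-type peak -> homozygous wild-type, only the mutant
    peak -> homozygous mutant, both peaks -> heterozygous.
    All numeric values are illustrative placeholders."""
    wt = any(abs(t - wildtype_tm) <= tol for t in tm_peaks)
    mut = any(abs(t - mutant_tm) <= tol for t in tm_peaks)
    if wt and mut:
        return 'heterozygous'
    if mut:
        return 'homozygous mutant'
    if wt:
        return 'homozygous wild-type'
    return 'undetermined'
```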


In Brazil, street markets and vegetable distributors discard vegetable leaves and stems, including those of carrot (Daucus carota L.). Seeking to reduce the waste of vegetable parts, this study chemically characterized the leaves of organically grown carrot at three stages of development to determine the best time for their removal and consumption as food. The leaves were dehydrated in an oven at 70 °C for 43 hours and analyzed for chemical composition, antioxidant activity, chlorophyll content, fatty acid composition, and calcium (Ca), sodium (Na), potassium (K), magnesium (Mg), manganese (Mn), iron (Fe), zinc (Zn), and copper (Cu) contents. The analyses indicated 100 days of development as the ideal stage for the removal and consumption of carrot leaves, with good antioxidant activity: only 63.78 ± 0.5 mg.L-1 of methanolic leaf extract was required to inhibit 50% of the free radical DPPH (2,2-diphenyl-1-picrylhydrazyl), and total protein and alpha-linolenic acid (18:3 n-3/LNA) contents were 18.23% ± 2.8 and 876.55 ± 20.62 mg.100 g-1 of dry matter, respectively.


This study assesses the effect of storage temperature on the anthocyanins of pasteurized and unpasteurized açaí pulp. The data were obtained using a pasteurized and lyophilized pulp (PLP) to evaluate the temperature effect (0, 25, and 40 °C). Part of the non-pasteurized frozen pulp (NPP) was pasteurized (NPP-P) at 90 °C for 30 seconds; both pulps were stored at 40 °C. The reduction in anthocyanin content was evaluated from the half-life time (t1/2), activation energy (Ea), temperature quotient (Q10), and reaction rate constant (k). The t1/2 of the PLP anthocyanins stored at 40 °C was 1.8 times shorter than at 25 °C and 15 times shorter than at 0 °C; higher temperatures therefore decreased the stability of the anthocyanins. Pasteurization increased the t1/2 by 6.6 times (10.14 hours for NPP and 67.28 hours for NPP-P). Anthocyanin degradation in NPP-P followed first-order kinetics, while in NPP it followed second-order kinetics; it can thus be said that the pasteurization process improves the preservation of anthocyanins in the pulp.
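The kinetic quantities named above (k, t1/2, Q10, Ea) are linked by standard first-order and Arrhenius relations; a minimal sketch, with the numeric checks using illustrative values rather than the study's measurements:

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def rate_constant(half_life):
    """First-order rate constant from half-life: k = ln 2 / t1/2."""
    return math.log(2) / half_life

def q10(k1, k2, t1_c, t2_c):
    """Temperature quotient Q10 from rate constants at t1_c < t2_c (degrees C)."""
    return (k2 / k1) ** (10.0 / (t2_c - t1_c))

def activation_energy(k1, k2, t1_c, t2_c):
    """Arrhenius activation energy (J/mol) from two rate constants."""
    T1, T2 = t1_c + 273.15, t2_c + 273.15
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
```

With the half-lives reported above, `rate_constant` gives k of roughly 0.068 h-1 for NPP and 0.010 h-1 for NPP-P; note that the first-order helper strictly applies only to NPP-P, since NPP degradation was reported to follow second-order kinetics.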


The efficiency of four sanitizers - peracetic acid, chlorhexidine, quaternary ammonium, and organic acids - was tested in this work against different bacteria recognized as a problem for the meat industry: Salmonella sp., S. aureus, E. coli and L. monocytogenes. The effects of sanitizer concentration (0.2, 0.5, 0.6, 1.0, 1.1 and 1.4%) at different temperatures (10 and 45 °C) and contact times (2, 10, 15, 18 and 25 minutes) were evaluated. Tests in an industrial plant were also carried out based on the previously obtained results. In general, peracetic acid showed higher efficiency at low concentration (0.2%) and short contact time (2 minutes) at 10 °C. The tests performed at industrial scale showed that peracetic acid performed well at concentrations and contact times lower than those suggested by the suppliers. Chlorhexidine and quaternary ammonium gave reasonable results at the indicated conditions, while organic acids were ineffective against Staphylococcus aureus even at concentrations and contact times higher than those indicated by the suppliers. Overall, the results show that the choice of the most adequate sanitizer depends on the contaminating microorganism, the time available for sanitizer application, and the process cost.


The electricity distribution sector will face significant changes in the future. Increasing reliability demands will call for major network investments. At the same time, electricity end-use is undergoing profound changes, including future energy technologies and other advances in the field. New technologies such as microgeneration and electric vehicles will have different kinds of impacts on electricity distribution network loads. In addition, smart metering provides more accurate electricity consumption data and opportunities to develop sophisticated load modelling and forecasting approaches. Thus, there are both demands and opportunities to develop a new type of long-term forecasting methodology for electricity distribution. The work concentrates on the technical and economic perspectives of electricity distribution. The doctoral dissertation proposes a methodology to forecast electricity consumption in distribution networks. The forecasting process consists of spatial analysis, clustering, end-use modelling, scenarios and simulation methods, and the load forecasts are based on the application of automatic meter reading (AMR) data. The developed long-term forecasting process produces power-based load forecasts. By applying these results, it is possible to forecast the impacts of changes on electrical energy in the network, and further, on the distribution system operator's revenue. These results are applicable to distribution network and business planning. This doctoral dissertation includes a case study, which tests the forecasting process in practice. For the case study, the most prominent future energy technologies are chosen, and their impacts on the electrical energy and power in the network are analysed. The most relevant topics related to changes in the operating environment, namely energy efficiency, microgeneration, electric vehicles, energy storage and demand response, are discussed in more detail.
The study shows that changes in electricity end-use may have radical impacts both on electrical energy and power in the distribution networks and on the distribution revenue. These changes will probably pose challenges for distribution system operators. The study suggests solutions for the distribution system operators on how they can prepare for the changing conditions. It is concluded that a new type of load forecasting methodology is needed, because the previous methods are no longer able to produce adequate forecasts.
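The scenario impacts described above can be caricatured as simple profile arithmetic: a base hourly load plus penetration-scaled technology profiles. This sketch only illustrates the idea and is not the dissertation's forecasting methodology:

```python
def scenario_load(base_profile, ev_profile, pv_profile,
                  ev_share, pv_share):
    """Hourly network load under a scenario: base consumption plus
    EV charging, minus microgeneration (all profiles in kW; the
    shares are 0..1 penetration levels -- illustrative only)."""
    return [
        base + ev_share * ev - pv_share * pv
        for base, ev, pv in zip(base_profile, ev_profile, pv_profile)
    ]
```

A power-based forecast of this shape makes the revenue impact visible directly, since distribution tariffs are billed on the resulting energy and peak power.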


Investing in mutual funds has become more popular than ever, and the amount of money invested in mutual funds registered in Finland has hit its all-time high. Mutual funds provide a relatively low-cost way for private investors to invest in the stock market and achieve diversified portfolios. In finance there is always a tradeoff between risk and return: higher expected returns can usually be achieved only by taking higher risks. Diversifying the portfolio removes some of the risk, but systematic risk cannot be diversified away. These risks can be managed by hedging the investments with derivatives. The use of derivatives should therefore improve the performance of the funds that use them relative to those that do not. However, previous studies have shown that the risk exposure and return performance of derivative users do not differ considerably from those of nonusers. The purpose of this study is to examine how the use of derivatives affects the performance of equity funds. The funds studied were 155 equity funds registered in Finland in 2013. The empirical research examined the derivative use of the funds over the six-year period 2008-2013. The performance of the funds was studied quantitatively using several performance measures common in the mutual fund industry: the Sharpe ratio, Treynor ratio, Jensen's alpha, Sortino ratio, M2 and Omega ratio. The effect of derivative use on the funds' performance was studied by using a dummy variable and comparing the performance measures of derivative users and nonusers. The differences in performance measures between the two groups were analyzed with statistical tests. The hypothesis was that derivative use should improve the funds' performance relative to funds that do not use derivatives. The results of this study are in line with previous studies stating that the use of derivatives does not improve mutual fund performance.
When performance was measured with Jensen's alpha, funds that did not use derivatives performed better than those that used them. When measured with the other performance measures, the results did not differ between the two groups.
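Three of the risk-adjusted measures named above can be computed from periodic return series as follows; this is a plain-Python sketch that leaves out return frequency, annualization, and the risk-free-rate convention for brevity:

```python
def mean(xs):
    return sum(xs) / len(xs)

def beta(fund, market):
    """Sample covariance of fund and market returns over market variance."""
    mf, mm = mean(fund), mean(market)
    cov = sum((f - mf) * (m - mm) for f, m in zip(fund, market))
    var = sum((m - mm) ** 2 for m in market)
    return cov / var

def sharpe(fund, rf):
    """Sharpe ratio: mean excess return over its sample standard deviation."""
    excess = [f - rf for f in fund]
    me = mean(excess)
    sd = (sum((e - me) ** 2 for e in excess) / (len(excess) - 1)) ** 0.5
    return me / sd

def treynor(fund, market, rf):
    """Treynor ratio: mean excess return per unit of market beta."""
    return (mean(fund) - rf) / beta(fund, market)

def jensen_alpha(fund, market, rf):
    """Jensen's alpha: mean return above the CAPM prediction."""
    return mean(fund) - (rf + beta(fund, market) * (mean(market) - rf))
```

The Sortino and Omega ratios follow the same pattern with downside-only denominators.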


This dissertation describes an approach for developing a real-time simulation of working mobile vehicles based on multibody modeling. The use of multibody modeling allows a comprehensive description of the constrained motion of the mechanical systems involved and permits real-time solving of the equations of motion. By carefully selecting the multibody formulation method to be used, it is possible to increase the accuracy of the multibody model while at the same time solving the equations of motion in real time. In this study, a multibody procedure based on semi-recursive and augmented Lagrangian methods for real-time dynamic simulation applications is studied in detail. In the semi-recursive approach, a velocity transformation matrix is introduced to map the dependent coordinates into relative (joint) coordinates, which reduces the number of generalized coordinates. The augmented Lagrangian method is based on the use of global coordinates, and constraints are accounted for using an iterative process. A multibody system can be modelled with either rigid or flexible bodies. When using flexible bodies, the system can be described using a floating frame of reference formulation, in which the deformation modes needed can be obtained from a finite element model. As the finite element model typically involves a large number of degrees of freedom, a reduced set of deformation modes can be obtained by employing model order reduction methods such as Guyan reduction, the Craig-Bampton method and Krylov subspaces, as shown in this study. The constrained motion of working mobile vehicles is actuated by the force from the hydraulic actuator. In this study, the hydraulic system is modeled using lumped fluid theory, in which the hydraulic circuit is divided into volumes; the pressure wave propagation in the hoses and pipes is neglected. Contact modeling is divided into two stages: contact detection and contact response.
Contact detection determines when and where contact occurs, and contact response provides the force acting at the collision point. The friction between tire and ground is modelled using the LuGre friction model, which describes the frictional force between two surfaces. Typically, the equations of motion are solved in full-matrix format, where the sparsity of the matrices is not considered. Increasing the number of bodies and constraint equations leads to system matrices that are large and sparse in structure. To increase the computational efficiency, a technique for the solution of sparse matrices is proposed in this dissertation and its implementation demonstrated. To assess computing efficiency, the augmented Lagrangian and semi-recursive methods are implemented employing the sparse matrix technique. The results of a numerical example show that the proposed approach is applicable and produces appropriate results within the real-time period.
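The sparse-matrix idea mentioned at the end can be illustrated with a compressed sparse row (CSR) matrix-vector product, the kind of operation such solvers avoid performing densely; this is a generic sketch, not the dissertation's implementation:

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x with A stored in CSR form: `values` and `col_idx`
    hold the nonzero entries and their columns; `row_ptr[r]` marks
    where row r's entries begin in those arrays."""
    y = []
    for r in range(len(row_ptr) - 1):
        y.append(sum(values[k] * x[col_idx[k]]
                     for k in range(row_ptr[r], row_ptr[r + 1])))
    return y
```

Storing only the nonzeros makes each multiply proportional to the number of nonzero entries rather than to the full matrix size, which is what yields the real-time gain for large, sparse system matrices.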


Older age increases the risk of developing a chronic atherosclerotic cardiovascular disease (CVD), such as coronary heart disease. Complications of CVDs, myocardial infarction or stroke often lead to loss of functional capacity or premature death. Dyslipidemia, high serum levels of total or low-density lipoprotein cholesterol (LDL-c) and low levels of high-density lipoprotein cholesterol (HDL-c), is among the most important modifiable risk factors for CVDs; it can be treated with lifestyle modifications, and with lipid-lowering drugs, primarily statins. In older persons, however, the association of cholesterol levels with cardiovascular and all-cause mortality has been inconsistent in previous studies. Furthermore, the beneficial effects of statins in older persons without previous CVD are still somewhat unclear, and older persons are more prone to adverse effects from statins. This thesis presents a prospective cohort study (TUVA), exploring associations of cholesterol levels with mortality and the changes in cholesterol levels of a 70-year-old population in long-term follow-up. Further, prevalence of CVDs, risk factors and preventive medication use in the TUVA cohort is compared with respective prevalences in another age-matched cohort (UTUVA) 20 years later in order to examine the changes in cardiovascular risk over time. Additionally, to evaluate statin use patterns among older persons, an observational register study was conducted covering the total Finnish population aged 70 and older during 2000-2008. Based on individual-level data retrieved from national health registries, the population was classified into low, moderate and high risk groups according to estimated CVD risk. The prevalence, incidence and persistence of statin use among the risk groups was then evaluated based upon yearly statin purchases tracked from the Prescription Register. 
The prospective cohort study demonstrated that low total cholesterol, LDL-c and HDL-c were associated with higher mortality in a cohort of home-dwelling 70-year-olds. However, after adjusting for traditional cardiovascular risk factors and cancer, this association disappeared. Furthermore, low total cholesterol seemed to be protective, whereas low HDL-c strongly predicted an increased risk of CVD death. The cholesterol levels of those elderly who remained available for follow-up and were still home-dwelling at the age of 85 seemed to improve with advancing age. Compared to the TUVA cohort, the later-born UTUVA cohort had fewer CVDs, and their risk factors were better controlled, which was reflected in the higher use of preventive medications such as statins and antihypertensives. The register studies confirmed that statin use increased significantly during 2000-2008 among older persons, especially in the oldest (80+) age groups and among those at high risk for cardiovascular events. Two-thirds of new statin users persisted with their use during the four years of follow-up; most discontinuations occurred during the first year of use. In conclusion, statins are commonly used among older age groups in Finland. Most of the older statin users had a high cardiovascular event risk, indicating that the treatment is well directed towards those who are likely to benefit from it the most. No age limits should be put on the screening and treatment of dyslipidemia in older persons, but the benefits and adverse effects of statin treatment should be carefully weighed based on an individual assessment of the person's general health status and functional capacity. Physicians should pay more attention to medication adherence, especially when prescribing preventive medications.


This research addressed the question of the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through an explicit algorithm, an episodic context, or an examples approach. It was hypothesized that the use of an explicit algorithm represented in a flow-chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions. On the other hand, the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period. Near and far transfer algorithms were also introduced on Day Two. The results suggested that both explicit algorithms and episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of these two approaches on near and far transfer questions were not as easy to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of those steps to transfer questions. Similarly, the value of episodic context in cuing the retrieval of an algorithm may be limited by the subject's ability to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings for the procedures employed in the teaching of mathematics to students with learning problems are discussed in detail.
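The explicit regrouping procedure that the flow-chart condition taught step by step can be written out directly; a minimal sketch, assuming non-negative integers with the minuend at least as large as the subtrahend:

```python
def subtract_with_regrouping(minuend, subtrahend):
    """Column-by-column subtraction with regrouping (borrowing),
    mirroring the explicit classroom algorithm: work right to left,
    and when a top digit is too small, borrow 10 from the next column.
    Assumes minuend >= subtrahend >= 0."""
    top = [int(d) for d in str(minuend)]
    bottom = [int(d) for d in str(subtrahend).rjust(len(top), '0')]
    result = []
    for i in range(len(top) - 1, -1, -1):
        if top[i] < bottom[i]:
            top[i] += 10       # regroup: take 10 from the left...
            top[i - 1] -= 1    # ...and reduce the next column by 1
        result.append(top[i] - bottom[i])
    return int(''.join(str(d) for d in reversed(result)))
```

For example, 503 - 78 requires two regroupings in a row, which is exactly the case where learners in the study were reported to struggle.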