949 results for Processing Time


Relevance:

60.00%

Publisher:

Abstract:

Computerized soft-tissue simulation can provide unprecedented means for predicting facial appearance pre-operatively. Surgeons can virtually try out several surgical plans and choose the one with the best expected result for their patients while considering the corresponding soft-tissue outcome, and the simulation can serve as an interactive communication tool with their patients as well. There is a substantial body of work on simulating soft tissue for cranio-maxillofacial surgery. Although some of these methods have been realized as commercial products, none has been fully integrated into clinical practice due to insufficient accuracy and excessive processing time. This chapter presents the state of the art and the general workflow in facial soft-tissue simulation, along with an example of a patient-specific facial soft-tissue simulation method.
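
As a flavor of the simplest class of methods covered by such workflows, the sketch below relaxes a one-dimensional mass-spring chain after one end is displaced; the model, parameters, and toy mesh are illustrative assumptions, not the chapter's patient-specific method:

```python
# A minimal mass-spring relaxation sketch: a chain of tissue nodes
# settles into a new equilibrium after the "bony" end is displaced,
# mimicking how skin follows a repositioned skeletal segment.
# All values here are hypothetical.
import numpy as np

n = 10                              # chain of tissue nodes
x = np.linspace(0.0, 1.0, n)        # rest positions
rest_len = x[1] - x[0]
k, damping, dt = 50.0, 0.9, 0.01    # stiffness, velocity damping, time step
v = np.zeros(n)

x[0] += 0.1                         # surgical displacement of the bony end

for _ in range(2000):               # explicit integration until settled
    f = np.zeros(n)
    stretch = np.diff(x) - rest_len # spring elongation between neighbours
    f[:-1] += k * stretch           # force on left node of each spring
    f[1:]  -= k * stretch           # equal and opposite on right node
    f[0] = f[-1] = 0.0              # both ends held fixed
    v = damping * (v + f * dt)
    x += v * dt

print(np.round(x, 3))               # tissue redistributed between the ends
```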

Relevance:

60.00%

Publisher:

Abstract:

Geospatial information systems are used to analyze spatial data to provide decision makers with relevant, up-to-date information. The processing time required to produce this information is a critical component of response time. Despite advances in algorithms and processing power, many "human-in-the-loop" factors remain. Given the limited number of geospatial professionals, it is important that analysts use their time effectively, so automating common tasks and speeding up human-computer interaction without disrupting the analyst's workflow or attention is highly desirable. The following research describes a novel approach to increasing productivity with a wireless, wearable electroencephalograph (EEG) headset within the geospatial workflow.
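
A sketch of the kind of signal step such an EEG-driven interface needs: estimate band power from one channel and fire a workflow action when it crosses a threshold. The sampling rate, the alpha band, and the trigger threshold are hypothetical, as the abstract does not specify the headset pipeline:

```python
# Minimal band-power trigger sketch for one EEG channel.
import numpy as np
from scipy.signal import welch

FS = 256                              # assumed sampling rate (Hz)

def alpha_power(eeg_window):
    """Mean 8-12 Hz power of a 1-D EEG window via Welch's method."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

rng = np.random.default_rng(0)
window = rng.standard_normal(2 * FS)  # 2 s of mock EEG
if alpha_power(window) > 0.05:        # hypothetical trigger threshold
    print("trigger: auto-advance to next map tile")
else:
    print("no action")
```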

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Nonconvulsive status epilepticus (NCSE) is associated with a mortality rate of up to 18% and therefore requires prompt diagnosis and treatment. Our aim was to evaluate the diagnostic value of perfusion CT (PCT) in the differential diagnosis of NCSE versus postictal states in patients presenting with a persistent altered mental state after a preceding epileptic seizure. We hypothesized that regional cortical hyperperfusion can be measured by PCT in patients with NCSE, whereas it is not present in postictal states. MATERIALS AND METHODS: Nineteen patients with a persistent altered mental state after a preceding epileptic seizure underwent PCT and electroencephalography (EEG). Patients were stratified as presenting with NCSE (n = 9) or a postictal state (n = 10) on the basis of clinical history and EEG data. Quantitative and visual analysis of the perfusion maps was performed. RESULTS: Patients during NCSE had significantly increased regional cerebral blood flow (P < .0001), increased regional cerebral blood volume (P < .001), and decreased mean transit time (P < .001) compared with the postictal state. Regional cortical hyperperfusion was depicted in 7 of 9 patients with NCSE by ad hoc analysis of parametric perfusion maps under emergency conditions but was not a feature of postictal states. The areas of hyperperfusion were concordant with transient clinical symptoms and EEG topography in all cases. CONCLUSIONS: Visual analysis of perfusion maps detected regional hyperperfusion in NCSE with a sensitivity of 78%. The broad availability and short processing time of PCT in an emergency situation are benefits compared with EEG. Consequently, the use of PCT in epilepsy may accelerate the diagnosis of NCSE. PCT may qualify as a complementary diagnostic tool to EEG in patients with a persistent altered mental state after a preceding seizure.
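
For context, the three perfusion parameters reported above are linked by the central volume principle (standard in perfusion CT, not restated in the abstract):

\[ \mathrm{MTT} = \frac{\mathrm{CBV}}{\mathrm{CBF}} \]

with CBV in mL/100 g, CBF in mL/100 g/min, and MTT in minutes. Markedly increased CBF with only moderately increased CBV therefore manifests as the decreased MTT reported for NCSE.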

Relevance:

60.00%

Publisher:

Abstract:

The master production schedule (MPS) plays an important role in an integrated production planning system. It converts the strategic planning defined in a production plan into tactical operation execution. The MPS is also a tool by which top management controls manufacturing resources, and it becomes the input of downstream planning levels such as material requirement planning (MRP) and capacity requirement planning (CRP). Hence, an inappropriate decision during MPS development may lead to infeasible execution, which ultimately causes poor delivery performance. One must ensure that the proposed MPS is valid and realistic before it is released to the real manufacturing system. In practice, where the production environment is stochastic in nature, developing an MPS is no longer a simple task. Varying processing times and random events such as machine failures are just some of the underlying sources of uncertainty that can hardly be addressed at the planning stage, so a valid and realistic MPS is tough to achieve. The MPS creation problem becomes even more sophisticated as decision makers try to accommodate multiple objectives: minimizing inventory, maximizing customer satisfaction, and maximizing resource utilization. This study proposes a methodology for MPS creation that is able to deal with those obstacles: it takes uncertainty into account while trading off the conflicting objectives, incorporating fuzzy multi-objective linear programming (FMOLP) and discrete-event simulation (DES) for MPS development.
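
As an illustration of the FMOLP ingredient, the sketch below applies Zimmermann's max-min formulation with scipy.optimize.linprog: each fuzzy objective gets a linear membership function, and a single variable lambda (the lowest membership degree) is maximized. All coefficients, the capacity constraint, and the aspiration ranges are hypothetical; the paper's actual model and its DES coupling are not reproduced here:

```python
# Zimmermann max-min FMOLP sketch with two fuzzy objectives.
import numpy as np
from scipy.optimize import linprog

# Decision variables: production quantities x1, x2 of two products.
profit = np.array([4.0, 3.0])      # maximize profit @ x
hours  = np.array([2.0, 1.0])      # minimize machine hours @ x
A_cap  = np.array([[1.0, 1.0]])    # shared capacity: x1 + x2 <= 100
b_cap  = np.array([100.0])

# Aspiration ranges [l, u] for each fuzzy objective (assumed here; in
# practice derived from the individual optima of each objective).
profit_l, profit_u = 200.0, 400.0  # profit membership: (p@x - l)/(u - l)
hours_l,  hours_u  = 100.0, 200.0  # hours membership:  (u - h@x)/(u - l)

# Augmented variable vector [x1, x2, lam]; maximize lam = minimize -lam.
c = np.array([0.0, 0.0, -1.0])

# Membership constraints rewritten as A_ub @ z <= b_ub:
#   profit@x - lam*(u-l) >= l   ->  -profit@x + lam*(u-l) <= -l
#   hours@x  + lam*(u-l) <= u
A_ub = np.array([
    [-profit[0], -profit[1], profit_u - profit_l],
    [ hours[0],   hours[1],  hours_u  - hours_l ],
    [ A_cap[0, 0], A_cap[0, 1], 0.0],
])
b_ub = np.array([-profit_l, hours_u, b_cap[0]])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0, 1)])
x1, x2, lam = res.x
print(f"plan: x1={x1:.1f}, x2={x2:.1f}, overall satisfaction={lam:.2f}")
```

With these numbers the optimum is lambda = 2/3: neither objective is fully satisfied, but the worst-satisfied one is as satisfied as the capacity allows, which is exactly the trade-off behavior the methodology aims for.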

Relevance:

60.00%

Publisher:

Abstract:

Person-to-stock order picking is highly flexible and requires minimal investment costs in comparison to automated picking solutions. For these reasons, traditional picking is widespread in distribution and production logistics. Due to its typically large proportion of manual activities, picking causes the highest operative personnel costs of all intralogistics processes. The required personnel capacity in picking varies short- and mid-term due to fluctuations in capacity requirements. These dynamics are often balanced by employing minimal permanent staff and using seasonal help when needed. The resulting high personnel fluctuation necessitates the frequent training of new pickers, which, in combination with increasingly complex work contents, highlights the importance of learning processes in picking. In industrial settings, learning is often quantified based on diminishing processing time and cost requirements with increasing experience. The best-known industrial learning curve models include those of Wright, de Jong, Baloff, and Crossman, which are typically applied to the learning effects of an entire work crew rather than of individuals. These models have been validated in largely static work environments with homogeneous work contents. Little is known about learning effects in picking systems, where work contents are heterogeneous and individual work strategies vary among employees. A mix of temporary and steady employees with varying degrees of experience necessitates the observation of individual learning curves. In this paper, the individual picking performance development of temporary employees is analyzed and compared to that of steady employees in the same working environment.
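
Of the models named above, Wright's is the simplest: each doubling of cumulative repetitions multiplies the processing time per unit by a constant learning rate. A minimal sketch, with hypothetical values for the first-pick time and the learning rate:

```python
# Wright's learning-curve model: T(x) = T1 * x ** log2(learning_rate).
import numpy as np

def wright_time(x, t1=120.0, learning_rate=0.8):
    """Processing time of the x-th repetition under Wright's model.

    Each doubling of cumulative output multiplies the time per unit
    by `learning_rate` (an 80% curve here).
    """
    b = np.log2(learning_rate)          # negative exponent, ~ -0.322
    return t1 * np.asarray(x, dtype=float) ** b

picks = np.array([1, 2, 4, 8, 50, 100])
for n, t in zip(picks, wright_time(picks)):
    print(f"pick #{n:3d}: {t:6.1f} s")  # 120.0, 96.0, 76.8, 61.4, ...
```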

Relevance:

60.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model has also been found relevant for assessing other natural hazards such as rockfall, snow avalanches, and floods. The model allows for automatic source area delineation given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both source area delineation and propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results; however, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
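
For reference, the base rule that Flow-R's spreading algorithm improves upon is Holmgren's multiple-flow-direction algorithm, in which flow leaving a cell is divided among its downslope neighbours in proportion to (tan beta)^x. The sketch below implements only that base rule (Flow-R's improvements are omitted), on a toy DEM:

```python
# Holmgren (1994) multiple-flow-direction rule: the exponent x controls
# how divergent (x small) or channelized (x large) the spreading is.
import numpy as np

def holmgren_proportions(dem, row, col, cellsize=10.0, x=4.0):
    """Fraction of flow sent to each of the 8 neighbours of (row, col)."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    weights = np.zeros(8)
    for k, (dr, dc) in enumerate(offsets):
        r, c = row + dr, col + dc
        if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
            continue
        dist = cellsize * np.hypot(dr, dc)       # diagonal = sqrt(2)*cell
        tan_beta = (dem[row, col] - dem[r, c]) / dist
        if tan_beta > 0:                         # downslope neighbours only
            weights[k] = tan_beta ** x
    total = weights.sum()
    return weights / total if total > 0 else weights  # pit: no outflow

dem = np.array([[10., 9., 8.],
                [ 9., 8., 7.],
                [ 8., 7., 5.]])
print(holmgren_proportions(dem, 1, 1))  # most flow to the steepest cell
```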

Relevance:

60.00%

Publisher:

Abstract:

The platform-independent software package consisting of the oligonucleotide mass assembler (OMA) and the oligonucleotide peak analyzer (OPA) was created to support the analysis of oligonucleotide mass spectra. It calculates all theoretically possible fragments of a given input sequence and annotates them on an experimental spectrum, thus saving a large amount of manual processing time. The software performs analysis of precursor and product ion spectra of oligonucleotides and their analogues comprising user-defined modifications of the backbone, the nucleobases, or the sugar moiety, as well as adducts with metal ions or drugs. The ability to expand the library of building blocks and to implement individual structural variations makes it extremely useful for supporting the analysis of therapeutically active compounds. The functionality of the software tool is demonstrated on the examples of a platinated double-stranded oligonucleotide and a modified RNA sequence. Experiments also reveal the unique dissociation behavior of platinated higher-order DNA structures.
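
The core of the OMA idea, enumerating theoretical fragment masses and matching them against observed peaks, can be sketched as follows. Real oligonucleotide fragmentation chemistry (a/b/c/d and w/x/y/z ion types, charge states, adducts, and modifications) is omitted; the residue masses are approximate DNA monoisotopic values and the matching tolerance is hypothetical:

```python
# Enumerate 5'- and 3'-terminal fragment masses and annotate peaks.
RESIDUE = {"A": 313.0576, "C": 289.0464, "G": 329.0525, "T": 304.0461}
H2O = 18.0106

def terminal_fragments(seq):
    """Yield (label, mass) for every 5'- and 3'-terminal fragment."""
    for i in range(1, len(seq)):
        five, three = seq[:i], seq[i:]
        yield f"5'-{five}", sum(RESIDUE[b] for b in five) + H2O
        yield f"3'-{three}", sum(RESIDUE[b] for b in three) + H2O

def annotate(peaks, seq, tol=0.5):
    """Match observed masses (singly charged assumed) to fragments."""
    frags = list(terminal_fragments(seq))
    return {p: [lab for lab, m in frags if abs(m - p) <= tol]
            for p in peaks}

print(annotate([620.1, 651.1], "ACGT"))  # {620.1: ["5'-AC"], 651.1: ["3'-GT"]}
```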

Relevance:

60.00%

Publisher:

Abstract:

A two-pronged approach for the automatic quantitation of multiple sclerosis (MS) lesions on magnetic resonance (MR) images has been developed. This method includes the design and use of a pulse sequence for improved lesion-to-tissue contrast (LTC) and seeks to identify and minimize the sources of false lesion classifications in segmented images. The new pulse sequence, referred to as AFFIRMATIVE (Attenuation of Fluid by Fast Inversion Recovery with MAgnetization Transfer Imaging with Variable Echoes), improves the LTC, relative to spin-echo images, by combining Fluid-Attenuated Inversion Recovery (FLAIR) and Magnetization Transfer Contrast (MTC). In addition to acquiring fast FLAIR/MTC images, the AFFIRMATIVE sequence simultaneously acquires fast spin-echo (FSE) images for spatial registration of images, which is necessary for accurate lesion quantitation. Flow has been found to be a primary source of false lesion classifications. Therefore, an imaging protocol and reconstruction methods are developed to generate "flow images" which depict both coherent (vascular) and incoherent (CSF) flow. An automatic technique is designed for the removal of extra-meningeal tissues, since these are known to be sources of false lesion classifications. A retrospective, three-dimensional (3D) registration algorithm is implemented to correct for patient movement which may have occurred between AFFIRMATIVE and flow imaging scans. Following application of these pre-processing steps, images are segmented into white matter, gray matter, cerebrospinal fluid, and MS lesions based on AFFIRMATIVE and flow images using an automatic algorithm. All algorithms are seamlessly integrated into a single MR image analysis software package. Lesion quantitation has been performed on images from 15 patient volunteers. The total processing time is less than two hours per patient on a SPARCstation 20. The automated nature of this approach should provide an objective means of monitoring the progression, stabilization, and/or regression of MS lesions in large-scale, multi-center clinical trials.
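
The final segmentation step can be illustrated with a generic multispectral classifier; the sketch below clusters synthetic two-channel voxel intensities into four tissue classes with k-means, which stands in for, and is not, the paper's own segmentation algorithm:

```python
# Multispectral tissue classification sketch: each voxel is a feature
# vector of intensities from two co-registered channels (synthetic
# stand-ins for the AFFIRMATIVE and flow images).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic voxels drawn around four class centroids (WM, GM, CSF, lesion).
centers = np.array([[0.8, 0.1], [0.6, 0.2], [0.2, 0.9], [0.9, 0.8]])
voxels = np.vstack([c + 0.05 * rng.standard_normal((500, 2))
                    for c in centers])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(voxels)
for k in range(4):
    print(f"class {k}: {np.sum(labels == k)} voxels")
```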

Relevance:

60.00%

Publisher:

Abstract:

Vector control is the mainstay of malaria control programmes. Successful vector control relies profoundly on accurate information about the target mosquito populations in order to choose the most appropriate intervention for a given mosquito species and to monitor its impact. An impediment to identifying mosquito species is the existence of morphologically identical sibling species that play different roles in the transmission of pathogens and parasites. Currently, PCR diagnostics are used to distinguish between sibling species. PCR-based methods are, however, expensive and time-consuming, and their development requires a priori DNA sequence information. Here, we evaluated an inexpensive molecular proteomics approach for Anopheles species: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). MALDI-TOF MS is a well-developed protein profiling tool for the identification of microorganisms but has so far received little attention as a diagnostic tool in entomology. We measured MS spectra from specimens of 32 laboratory colonies and 2 field populations representing 12 Anopheles species, including the A. gambiae species complex. An important step in the study was the advancement and implementation of a bioinformatics approach improving the resolution over previously applied cluster analysis. Borrowing tools for linear discriminant analysis from genomics, MALDI-TOF MS accurately identified taxonomically closely related mosquito species, including the separation between the M and S molecular forms of A. gambiae sensu stricto. The approach also classifies specimens from different laboratory colonies, hence also proving very promising for colony authentication as part of quality assurance in laboratory studies. Besides being exceptionally accurate and robust, MALDI-TOF MS has several advantages over other typing methods, including simple sample preparation and short processing time. As the method does not require DNA sequence information, data can also be reviewed at any later stage for diagnostic or functional patterns without the need for re-designing and re-processing biological material.
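
The discriminant step can be sketched as follows with scikit-learn, using synthetic binned spectra in place of real MALDI-TOF data; the study's actual feature extraction and bioinformatics pipeline are not reproduced here:

```python
# Linear discriminant analysis on binned mass spectra (mock data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_bins, n_per_species = 50, 40
species_profiles = rng.random((3, n_bins))          # 3 mock species

# Each specimen = species profile + measurement noise.
X = np.vstack([p + 0.3 * rng.standard_normal((n_per_species, n_bins))
               for p in species_profiles])
y = np.repeat([0, 1, 2], n_per_species)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```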

Relevance:

60.00%

Publisher:

Abstract:

This study investigated the roles of the right and left dorsolateral prefrontal cortex (rDLPFC, lDLPFC) and the medial frontal cortex (MFC) in executive functioning using a theta burst transcranial magnetic stimulation (TMS) approach. Healthy subjects solved two visual search tasks: a number search task with low cognitive demands, and a number and letter search task with high cognitive demands. To observe how subjects solved the tasks, we used eye movements to assess their behavior with and without TMS when subjects were confronted with specific executive demands. To observe executive functions, we were particularly interested in TMS-induced changes in visual exploration strategies found to be associated with good or bad performance in a control condition without TMS stimulation. TMS left processing time unchanged in both tasks. Inhibition of the rDLPFC resulted in a decrease in anticipatory fixations in the number search task, i.e., a decrease in a good strategy in this low-demand task. This was paired with a decrease in stimulus fixations. Together, these results point to a role of the rDLPFC in planning and response selection. Inhibition of the lDLPFC and the MFC resulted in an increase in anticipatory fixations in the number and letter search task, i.e., an increase in the application of a good strategy in this task. We interpret these results as a compensatory strategy to account for TMS-induced deficits in attentional switching when faced with high switching demands. After inhibition of the lDLPFC, an increase in regressive fixations was found in the number and letter search task. In the context of high working memory demands, this strategy appears to compensate for TMS-induced working memory deficits. Combining an experimental TMS approach with the recording of eye movements proved sensitive to discrete decrements of executive functions and allowed pinpointing the functional organization of the frontal lobes.
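
A minimal sketch of the kind of fixation bookkeeping such an analysis rests on is given below; the operational definitions of "anticipatory" and "regressive" fixations used here are assumptions for illustration, not the paper's exact criteria:

```python
# Count anticipatory vs. regressive fixations in a serial search task.
def classify_fixations(fixations, targets):
    """fixations: sequence of fixated item ids; targets: search order."""
    found = set()
    next_idx = 0
    counts = {"anticipatory": 0, "regressive": 0, "current": 0}
    for fix in fixations:
        if next_idx < len(targets) and fix == targets[next_idx]:
            counts["current"] += 1       # on-task fixation of next target
            found.add(fix)
            next_idx += 1
        elif fix in found:
            counts["regressive"] += 1    # return to an already-found item
        elif next_idx + 1 < len(targets) and fix == targets[next_idx + 1]:
            counts["anticipatory"] += 1  # look-ahead to the item after next
        # fixations on other items are ignored in this sketch
    return counts

print(classify_fixations([1, 3, 2, 1, 3], targets=[1, 2, 3]))
```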

Relevance:

60.00%

Publisher:

Abstract:

Cloudification of the Centralized Radio Access Network (C-RAN), in which signal processing runs on general-purpose processors inside virtual machines, has lately received significant attention. Due to the short deadlines in the LTE Frequency Division Duplex access method, processing time fluctuations introduced by virtualization have a deep impact on C-RAN performance. This paper evaluates bottlenecks in the cloud performance of OpenAirInterface (OAI, an open-source software-based implementation of LTE), provides feasibility studies on C-RAN execution, and introduces a cloud architecture that significantly reduces the encountered execution problems. In typical cloud environments, the OAI processing time deadlines cannot be guaranteed. Our proposed cloud architecture shows good characteristics for OAI cloud execution; as an example, in our setup more than 99.5% of processed LTE subframes meet reasonable processing deadlines, close to the performance of a dedicated machine.
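
The deadline statistic quoted above can be illustrated with a simple measurement harness: time a per-subframe processing function and report the fraction finishing within budget. The 2 ms budget and the dummy workload are assumptions, not OAI's actual pipeline:

```python
# Measure the fraction of subframes processed within a deadline.
import time

DEADLINE_S = 0.002          # assumed per-subframe processing budget

def process_subframe():
    """Dummy stand-in for baseband processing of one 1 ms subframe."""
    t_end = time.perf_counter() + 0.0005
    while time.perf_counter() < t_end:
        pass                # busy-wait to simulate CPU-bound work

def deadline_hit_rate(n_subframes=1000):
    hits = 0
    for _ in range(n_subframes):
        start = time.perf_counter()
        process_subframe()
        if time.perf_counter() - start <= DEADLINE_S:
            hits += 1
    return hits / n_subframes

print(f"subframes meeting the deadline: {deadline_hit_rate():.1%}")
```

On a virtualized host, scheduler preemptions show up directly as a lower hit rate here, which is the effect the paper's architecture is designed to suppress.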

Relevance:

60.00%

Publisher:

Abstract:

In recent decades we have witnessed a renewed interest in the early Argentine reception of Antonio Gramsci's analytical categories. Sharing that interest, this paper sets out to analyze the novelties introduced by the Gramscian reading of the University Reform that Juan Carlos Portantiero proposed in the 1970s. To this end, the paper begins by situating the theses of Estudiantes y política en América Latina. El proceso de la reforma universitaria (1918-1938) within the framework of interpretations of the Reform, then lays out the political-intellectual project in which Portantiero was engaged at the time he elaborated his reading, and finally concentrates on the author's main theses.

Relevance:

60.00%

Publisher:

Abstract:

The simulation of interest rate derivatives is a powerful tool for coping with current market fluctuations. However, the complexity of the financial models and the way they are processed requires exorbitant computation times, which is in clear conflict with the need for processing times as short as possible to operate in the financial market. To shorten the computation time of pricing financial derivatives, the use of hardware accelerators becomes a must.
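
A sketch of the kind of workload in question: Monte Carlo simulation of a short-rate model to price a zero-coupon bond. The Vasicek model, its parameters, and the NumPy implementation are illustrative assumptions; the hot path loop is exactly the part such systems offload to FPGA/GPU hardware:

```python
# Monte Carlo pricing of a zero-coupon bond under the Vasicek model:
#   dr = kappa * (theta - r) dt + sigma dW,  price = E[exp(-integral r dt)]
import numpy as np

def vasicek_zcb_price(r0=0.03, kappa=0.5, theta=0.04, sigma=0.01,
                      T=1.0, n_steps=250, n_paths=100_000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    r = np.full(n_paths, r0)
    integral = np.zeros(n_paths)
    for _ in range(n_steps):                  # the hot loop to accelerate
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        r = r + kappa * (theta - r) * dt + sigma * dw
        integral += r * dt
    return np.exp(-integral).mean()

print(f"zero-coupon bond price: {vasicek_zcb_price():.4f}")
```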