994 results for "reasonable time"


Relevance:

70.00%

Publisher:

Abstract:

Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints, which implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to other approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
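The abstract does not reproduce the formulation itself; the following is only a minimal sketch of a constraint-programming allocation model in the same spirit, written with Google OR-Tools CP-SAT as an assumed, illustrative solver choice. The segment utilisations, node count, and capacity budget are hypothetical.

```python
# Minimal constraint-programming sketch: allocate task segments to nodes so that
# no node's utilisation budget is exceeded. Illustrative only; not the authors'
# actual formulation. Assumes Google OR-Tools is installed.
from ortools.sat.python import cp_model

segments = [20, 35, 10, 25, 15]   # hypothetical per-segment utilisation (%)
nodes = 3
capacity = 70                      # hypothetical per-node utilisation budget (%)

model = cp_model.CpModel()
# assign[s][n] == 1 iff segment s runs on node n
assign = [[model.NewBoolVar(f"a_{s}_{n}") for n in range(nodes)]
          for s in range(len(segments))]

for s in range(len(segments)):
    model.Add(sum(assign[s]) == 1)             # each segment placed on exactly one node
for n in range(nodes):
    model.Add(sum(segments[s] * assign[s][n] for s in range(len(segments))) <= capacity)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for s in range(len(segments)):
        node = next(n for n in range(nodes) if solver.Value(assign[s][n]))
        print(f"segment {s} -> node {node}")
else:
    print("no feasible allocation")            # the solver proves infeasibility if none exists
```

A solver-based model like this either returns a feasible placement or proves none exists, which is the guarantee the abstract contrasts with heuristic approaches.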

Relevance:

70.00%

Publisher:

Abstract:

Background: Oscillatory activity, which can be separated into background and oscillatory burst pattern activities, is thought to reflect local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially in cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events in single trials with a reliable and consistent method is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox that fits, in a reasonable time, a bump time-frequency model to the wavelet representations of a set of signals. The software is provided with a Matlab toolbox that can compute the wavelet representations before automatically calling the stand-alone application. Conclusion: The tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
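For context, a continuous wavelet representation of the kind the toolbox consumes can be computed in a few lines. The sketch below uses PyWavelets rather than the authors' Matlab code, and the toy signal, sampling rate, and scale range are hypothetical.

```python
# Illustrative continuous wavelet transform of a single EEG-like trial.
# Uses PyWavelets; this is not the bump-modelling toolbox itself.
import numpy as np
import pywt

fs = 256.0                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # toy 10 Hz rhythm + noise

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)

power = np.abs(coeffs) ** 2                  # time-frequency power map, the input to bump modelling
print(power.shape, freqs.min(), freqs.max())
```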

Relevance:

60.00%

Publisher:

Abstract:

We shall examine a model, first studied by Brockwell et al. [Adv Appl Probab 14 (1982) 709.], which can be used to describe the long-term behaviour of populations that are subject to catastrophic mortality or emigration events. Populations can suffer dramatic declines when disease, such as an introduced virus, affects the population, or when food shortages occur due to overgrazing or fluctuations in rainfall. However, perhaps surprisingly, such populations can survive for long periods and, although they may eventually become extinct, they can exhibit an apparently stationary regime. It is useful to be able to model this behaviour. This is particularly true of the ecological examples that motivated the present study, since, in order to manage these populations properly, it is necessary to be able to predict persistence times and to estimate the conditional probability distribution of population size. We shall see that although our model predicts eventual extinction, the time till extinction can be long, and the apparently stationary behaviour exhibited by these populations over any reasonable time scale can be explained using a quasi-stationary distribution. (C) 2001 Elsevier Science Ltd. All rights reserved.
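As a hedged illustration of the kind of process being studied, and not of the authors' exact model, the sketch below simulates a simple birth-death process punctuated by catastrophes that remove a random batch of individuals; all rates and the catastrophe law are hypothetical.

```python
# Toy birth-death-catastrophe simulation (Gillespie-style). Illustrative only;
# rates and the catastrophe size distribution are hypothetical, not Brockwell et al.'s model.
import random

def simulate(n0=50, birth=1.0, death=0.9, cat_rate=0.05, t_max=1000.0, seed=1):
    random.seed(seed)
    n, t = n0, 0.0
    while n > 0 and t < t_max:
        total = n * (birth + death) + cat_rate
        t += random.expovariate(total)           # time to the next event
        u = random.uniform(0.0, total)
        if u < n * birth:
            n += 1                                # birth
        elif u < n * (birth + death):
            n -= 1                                # death
        else:
            n -= min(n, 1 + int(random.expovariate(0.5)))  # catastrophe removes a random batch
    return t, n

t_end, n_end = simulate()
print(f"extinct at t={t_end:.1f}" if n_end == 0 else f"alive at t={t_end:.1f} with n={n_end}")
```

Runs that survive to t_max illustrate the long-lived, apparently stationary regime discussed in the abstract, even though extinction is certain in the long run.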

Relevance:

60.00%

Publisher:

Abstract:

This paper is concerned with evaluating the performance of loss networks. Accurate determination of loss network performance can assist in the design and dimensioning of telecommunications networks. However, exact determination can be difficult and generally cannot be done in reasonable time. For these reasons there is much interest in developing fast and accurate approximations. We develop a reduced load approximation that improves on the famous Erlang fixed point approximation (EFPA) in a variety of circumstances. We illustrate our results with reference to a range of networks for which the EFPA may be expected to perform badly.
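The Erlang fixed point approximation that the paper improves on can be sketched briefly: each link's blocking probability is computed from the Erlang B formula under a reduced offered load, and the system is iterated to a fixed point. The sketch below is a generic illustration on a hypothetical two-link topology, not the authors' refined approximation.

```python
# Generic Erlang fixed point (reduced load) iteration for a small loss network in which
# each route uses a subset of links. Hypothetical topology and loads; illustrative only.
import math

def erlang_b(load, circuits):
    """Erlang B blocking probability via the numerically stable recursion."""
    b = 1.0
    for k in range(1, circuits + 1):
        b = load * b / (k + load * b)
    return b

links = {"A": 10, "B": 12}                              # circuits per link (hypothetical)
routes = {"r1": (["A"], 6.0), "r2": (["A", "B"], 4.0)}  # route -> (links used, offered load)

blocking = {l: 0.0 for l in links}
for _ in range(100):                                    # fixed point iteration
    new = {}
    for l, c in links.items():
        # reduced load on link l: each route's traffic thinned by blocking on its *other* links
        load = sum(a * math.prod(1 - blocking[m] for m in path if m != l)
                   for path, a in routes.values() if l in path)
        new[l] = erlang_b(load, c)
    if max(abs(new[l] - blocking[l]) for l in links) < 1e-10:
        blocking = new
        break
    blocking = new

print({l: round(b, 4) for l, b in blocking.items()})
```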

Relevance:

60.00%

Publisher:

Abstract:

The field of action for rehabilitation is that of making the most of the patient's maximum functional capacity, with the purpose of adapting to life in relation to the environment. Rehabilitation must commence immediately, although it may take different forms from the acute phase to the sequelae. It is considered appropriate to call the physiatrist as soon as the neurologic condition has stabilised. The measures to be taken for rehabilitation in the acute phase and in the sequelae are listed, and the composition of the rehabilitation team is described. Concerning location, where should the patient be rehabilitated? Ambulatory patients should receive their rehabilitation as outpatients; our experience with house calls is briefly described. Among the patients who cannot walk, those presenting an eminently motor condition with the prospect of walking again should stay with their families, with transport provided to health and rehabilitation centres. The second group, with the capacity to walk within a reasonable time, especially if they have multiple associated problems such as impaired communication, should be hospitalised in a rehabilitation department. The third group consists of severely handicapped patients, for whom a solution must be found that provides a life with a minimum of dignity in centres or homes. Among the measures to be introduced, we point out the following: acquisition of transport for patients who must travel to the department as outpatients, and giving family doctors complete freedom to refer their patients to rehabilitation centres.

Relevance:

60.00%

Publisher:

Abstract:

This chapter aims to develop a taxonomic framework to classify studies of the flexible job shop scheduling problem (FJSP). The FJSP is a generalization of the classical job shop scheduling problem (JSP), which is one of the oldest NP-hard problems. Although various solution methodologies have been developed to obtain good solutions in reasonable time for FJSPs with different objective functions and constraints, no study that systematically reviews the FJSP literature has been found. In the proposed taxonomy, the type of study, type of problem, objective, methodology, data characteristics, and benchmarking are the main categories. To verify the proposed taxonomy, a variety of papers from the literature are classified. Using this classification, several inferences are drawn and gaps in the FJSP literature are identified. The aim of the proposed taxonomy is to provide a framework for a broad view of the FJSP literature and to construct a basis for future studies.

Relevance:

60.00%

Publisher:

Abstract:

In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources, but also the light interreflections. This kind of algorithm produces very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we take advantage of the hierarchical nature of such an algorithm.
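To make the computational load concrete, the sketch below shows the kind of iterative radiosity gathering step that parallel (and, in the paper, speculative) schemes aim to accelerate. The form factors and reflectances are random placeholders, and this sequential NumPy version is illustrative background rather than the authors' hierarchical algorithm.

```python
# Minimal radiosity gathering iteration: B = E + rho * (F @ B), repeated until convergence.
# Random form factors and reflectances stand in for a real scene; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # number of patches (hypothetical)
F = rng.random((n, n))
F /= F.sum(axis=1, keepdims=True)         # rows normalised like form factors
rho = rng.uniform(0.2, 0.8, size=n)       # patch reflectances
E = np.zeros(n)
E[:5] = 1.0                               # a few emitting patches (light sources)

B = E.copy()
for it in range(1000):
    B_new = E + rho * (F @ B)             # gather radiosity reaching each patch
    if np.max(np.abs(B_new - B)) < 1e-8:
        B = B_new
        break
    B = B_new

print(f"converged after {it + 1} iterations, mean radiosity {B.mean():.4f}")
```

Each gathering step costs O(n^2) for n patches, which is what makes hierarchical and parallel formulations attractive for complex environments.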

Relevance:

60.00%

Publisher:

Abstract:

The radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba solutions have been standardised using a 4πβ-4πγ coincidence counting system we have recently set up. Detection in the beta channel is performed using various geometries of a UPS-89 plastic scintillator optically coupled to a selected low-noise 1 in. diameter photomultiplier tube. The light-tight thin capsule that encloses this beta detector is housed within the well of a 5 in. × 5 in. NaI(Tl) monocrystal detector. The beta detection efficiency can be varied either by optical filtering or by electronic discrimination when the electrons lose all their energy in the plastic scintillator. This 4πβ-4πγ coincidence system improves on our 4πβ(PC)-γ system in that its sample preparation is less labour intensive, it yields larger beta- and gamma-counting efficiencies, thus enabling the standardisation of low-activity sources with good statistics in reasonable time, and it makes standardising short-lived radionuclides easier. The resulting radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba agree with those measured with other primary measurement methods, thus validating our 4πβ-4πγ coincidence counting system.
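In 4πβ-γ coincidence counting the activity is commonly estimated from the beta, gamma, and coincidence rates, with an efficiency-extrapolation fit to remove residual efficiency dependence. The sketch below illustrates that generic procedure on made-up count rates; it is not the authors' analysis, and dead-time and background corrections are deliberately omitted.

```python
# Generic 4πβ-γ coincidence counting estimate with efficiency extrapolation.
# N_beta * N_gamma / N_c approximates the activity; extrapolating it against
# (1 - eff_beta)/eff_beta, with eff_beta = N_c / N_gamma, reduces residual bias.
# Count rates are fabricated for illustration; no corrections are applied.
import numpy as np

n_beta  = np.array([9800.0, 9300.0, 8700.0, 8000.0])   # s^-1, at decreasing beta efficiency
n_gamma = np.array([4100.0, 4080.0, 4120.0, 4090.0])
n_coinc = np.array([3950.0, 3750.0, 3500.0, 3230.0])

eff_beta = n_coinc / n_gamma
activity = n_beta * n_gamma / n_coinc                   # first-order activity estimate
x = (1.0 - eff_beta) / eff_beta

slope, intercept = np.polyfit(x, activity, 1)           # linear efficiency extrapolation
print(f"extrapolated activity ~ {intercept:.0f} s^-1 (raw estimates {activity.round(0)})")
```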

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Transient balanced steady-state free-precession (bSSFP) has shown substantial promise for noninvasive assessment of coronary arteries, but its utilization at 3.0 T and above has been hampered by susceptibility to field inhomogeneities that degrade image quality. The purpose of this work was to refine, implement, and test a robust, practical single-breathhold bSSFP coronary MRA sequence at 3.0 T and to test the reproducibility of the technique. METHODS: A 3D, volume-targeted, high-resolution bSSFP sequence was implemented. Localized image-based shimming was performed to minimize inhomogeneities of both the static magnetic field and the radio frequency excitation field. Fifteen healthy volunteers and three patients with coronary artery disease underwent examination with the bSSFP sequence (scan time = 20.5 ± 2.0 seconds), and acquisitions were repeated in nine subjects. The images were quantitatively analyzed using a semi-automated software tool, and the repeatability and reproducibility of measurements were determined in a blinded manner using regression analysis and the intra-class correlation coefficient (ICC). RESULTS: The 3D bSSFP sequence provided uniform, high-quality depiction of the coronary arteries (n = 20). The average visible vessel length of 100.5 ± 6.3 mm and sharpness of 55 ± 2% compared favorably with earlier reported navigator-gated bSSFP and gradient echo sequences at 3.0 T. Length measurements demonstrated a highly statistically significant degree of inter-observer (r = 0.994, ICC = 0.993), intra-observer (r = 0.894, ICC = 0.896), and inter-scan concordance (r = 0.980, ICC = 0.974). Furthermore, ICC values demonstrated excellent intra-observer, inter-observer, and inter-scan agreement for vessel diameter measurements (ICC = 0.987, 0.976, and 0.961, respectively) and vessel sharpness values (ICC = 0.989, 0.938, and 0.904, respectively). CONCLUSIONS: The 3D bSSFP acquisition, using a state-of-the-art MR scanner equipped with recently available technologies such as multi-transmit excitation, a 32-channel cardiac coil, and localized B0 and B1+ shimming, allows accelerated and reproducible multi-segment assessment of the major coronary arteries at 3.0 T in a single breathhold. This rapid sequence may be especially useful for functional imaging of the coronary arteries, where the acquisition time is limited by the stress duration, and in cases where low navigator-gating efficiency prohibits acquisition of a free-breathing scan in a reasonable time period.
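Repeatability here is quantified with intra-class correlation coefficients; the abstract does not state which ICC form was used, so the sketch below computes one common variant, the two-way consistency ICC(3,1), from a small fabricated table of repeated vessel-length measurements.

```python
# Two-way consistency ICC(3,1) (Shrout & Fleiss) computed from scratch on fabricated data:
# rows = subjects, columns = repeated measurements/observers. Illustrative only; the paper
# does not specify which ICC definition was applied.
import numpy as np

x = np.array([[101.2, 100.8],
              [ 95.4,  96.1],
              [108.9, 109.3],
              [ 99.7,  98.9]])          # hypothetical vessel lengths (mm), 4 subjects x 2 readings

n, k = x.shape
grand = x.mean()
row_means = x.mean(axis=1)
col_means = x.mean(axis=0)

ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)            # between-subject mean square
resid = x - row_means[:, None] - col_means[None, :] + grand
ms_error = np.sum(resid ** 2) / ((n - 1) * (k - 1))                 # residual mean square

icc_3_1 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
print(f"ICC(3,1) = {icc_3_1:.3f}")
```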

Relevance:

60.00%

Publisher:

Abstract:

Since the development of the first whole-cell living biosensor, or bioreporter, about 15 years ago, the construction and testing of new genetically modified microorganisms for environmental sensing and reporting has proceeded at an ever-increasing rate. One and a half decades appears to be a reasonable time span for a new technology to reach the maturity needed for application and commercial success. It seems, however, that research into cellular biosensors is still mostly in a proof-of-principle or demonstration phase and not close to extensive or commercial use outside of academia. In this review, we consider the motivations for bioreporter development and discuss the suitability of extant bioreporters for the proposed applications, in order to stimulate complementary research and to help researchers develop realistic objectives. This includes the identification of some popular misconceptions about the qualities and shortcomings of bioreporters.

Relevance:

60.00%

Publisher:

Abstract:

Nanotechnology is becoming part of our daily life in a wide range of products such as computers, bicycles, sunscreens or nanomedicines. While these applications are already becoming reality, considerable work awaits scientists, engineers, and policy makers who want such nanotechnological products to yield a maximum of benefit at a minimum of social, environmental, economic and (occupational) health cost. Considerable efforts for coordination and collaboration in research are needed if these goals are to be reached within a reasonable time frame and at an affordable price. This is recognized in Europe by the European Commission, which not only funds research projects but also supports the coordination of research efforts. One of these coordination efforts is NanoImpactNet, a researcher-operated network that started in 2008 to promote scientific cross-talk across all disciplines on the health and environmental impact of nanomaterials. Stakeholders contribute to these activities, notably the definition of research and knowledge needs. Initial discussions in this domain focused on finding agreement on common metrics and on which elements are needed for standardized approaches to hazard and exposure identification. There are many nanomaterial properties that may play a role. Hence, to gain the time needed to study this complex matter full of uncertainties, researchers and stakeholders unanimously called for simple, easy and fast risk assessment tools that can support decision making in this rapidly moving and growing domain. Today, several projects are starting or already running that will develop such assessment tools. At the same time, other projects investigate in depth which factors and material properties can lead to unwanted toxicity or exposure, what mechanisms are involved, and how such responses can be predicted and modelled. A vision for the future is that once these factors, properties and mechanisms are understood, they can and will be accounted for in the development of new products and production processes, following the idea of "Safety by Design". The promise of all these efforts is a future with nanomaterials in which most of their risks are recognized and addressed before they even reach the market.

Relevance:

60.00%

Publisher:

Abstract:

Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed with Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload a large number of sequences to be aligned by the parallel version of T-Coffee that could not be aligned on a single machine due to memory and execution-time constraints. The web portal provides a user-friendly solution.
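PTC distributes alignment work with MPI. The following sketch merely illustrates that scatter/gather pattern with mpi4py on a toy workload of sequence pairs, with hypothetical data and a placeholder similarity score; it is not PTC's actual code.

```python
# Toy MPI scatter/gather pattern in the spirit of distributing alignment work.
# Run with e.g.: mpirun -n 4 python align_mpi.py   (mpi4py required; illustrative only)
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    pairs = [("ACGT", "ACGA"), ("GATTACA", "GATTTACA"), ("TTT", "TAT"), ("CCGG", "CCGC")]
    chunks = [pairs[i::size] for i in range(size)]      # round-robin split of the work
else:
    chunks = None

my_pairs = comm.scatter(chunks, root=0)                  # each rank receives its share

def toy_score(a, b):
    """Placeholder similarity: matching positions over the shorter length (not a real alignment)."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

my_results = [(a, b, toy_score(a, b)) for a, b in my_pairs]
all_results = comm.gather(my_results, root=0)            # collect partial results on rank 0

if rank == 0:
    for part in all_results:
        for a, b, s in part:
            print(f"{a} vs {b}: {s:.2f}")
```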

Relevance:

60.00%

Publisher:

Abstract:

The year-by-year growth in computer processing power has made it possible to process spectral images, which are more detailed than greyscale and RGB colour images, in a reasonable time and without great expense. The problem, however, is that storage and data-transfer media have not kept pace with this processing power. The solution to this problem is to compress spectral images for storage and transfer. This work presents a method in which a spectral image is compressed in two stages: first by clustering with a self-organizing map (SOM), and then by continuing the compression with conventional methods. The compression ratios obtained are significant while the distortion remains tolerable. The work was carried out in the Laboratory of Information Processing of the Department of Information Technology at Lappeenranta University of Technology, as part of a larger research project on image compression.
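As a hedged illustration of the first, SOM-based clustering stage (not the thesis code), the sketch below trains a tiny self-organizing map on random spectral pixel vectors in NumPy and then replaces each pixel by the index of its best-matching codebook vector. The map size, number of spectral bands, and training schedule are arbitrary.

```python
# Tiny self-organizing map used as a vector quantizer for spectral pixels.
# All sizes and the training schedule are arbitrary; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.random((2000, 31))              # hypothetical 31-band spectral pixels
grid_h = grid_w = 8                          # 8x8 SOM -> 64 codebook vectors
codebook = rng.random((grid_h, grid_w, pixels.shape[1]))
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

iters = 5000
for i in range(iters):
    x = pixels[rng.integers(len(pixels))]
    dists = np.linalg.norm(codebook - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)       # best matching unit
    lr = 0.5 * (1 - i / iters)                                   # decaying learning rate
    sigma = 3.0 * (1 - i / iters) + 0.5                          # decaying neighbourhood width
    h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    codebook += lr * h[..., None] * (x - codebook)               # pull neighbourhood towards x

# Quantize: map each pixel to its nearest codebook vector (the first compression stage).
flat = codebook.reshape(-1, pixels.shape[1])
indices = np.argmin(((pixels[:, None, :] - flat[None, :, :]) ** 2).sum(-1), axis=1)
print("codebook size:", flat.shape[0], "mean quantization error:",
      np.linalg.norm(pixels - flat[indices], axis=1).mean().round(4))
```

The resulting index map and small codebook are what a conventional coder would then compress in the second stage.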

Relevance:

60.00%

Publisher:

Abstract:

The aim of this work was to investigate the suitability of an optimization program developed at the Department of Mechanical Engineering for the optimization of virtual prototypes. A further aim was to establish the limitations and possibilities of optimization based on virtual prototypes using real optimization tasks. The Optimaze program was connected to the simulation software by means of marker files and the simulation software's internal macros. The functionality of the resulting optimization environment was tested with two optimization tasks involving a real boom. ADAMS was used as the simulation software and differential evolution as the optimization algorithm. The results showed that the optimization program is suitable for the optimization of virtual prototypes. However, the optimization of heavy models was found to be too slow a process, and the study concluded that this issue requires further research and development work.
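Differential evolution is available off the shelf; the sketch below runs SciPy's implementation on a cheap analytic stand-in objective (replacing the expensive ADAMS boom simulation, with hypothetical design-variable bounds) purely to illustrate the optimization loop.

```python
# Differential evolution on a cheap stand-in objective. In the thesis the objective
# would be an expensive ADAMS virtual-prototype simulation; here it is a toy function.
import numpy as np
from scipy.optimize import differential_evolution

def boom_cost(x):
    """Hypothetical stand-in for a boom design cost that a simulation run would return."""
    height, width = x
    return (height - 1.3) ** 2 + (width - 0.7) ** 2 + 0.1 * np.sin(5 * height * width)

bounds = [(0.5, 2.0), (0.2, 1.5)]            # hypothetical design-variable ranges (m)
result = differential_evolution(boom_cost, bounds, maxiter=200, seed=0, polish=True)
print("best design:", result.x.round(3), "cost:", round(result.fun, 5))
```

With a real simulator in place of boom_cost, each evaluation can take minutes, which is exactly the slowness the thesis observed for heavy models.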

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents different aspects of Web Services usage in Symbian OS, an operating system for handheld devices. The practical part of the work was to develop a Symbian OS client application for Web Services; it produced four reusable software components. XML enables platform- and programming-language-independent services, and Web Services use XML to create standardized, message-oriented services that are accessed through HTTP. Web Services are moving towards dynamic B2B interaction. Web Services increase the amount of transferred data, which is undesirable in mobile networks, where transfer speeds are lower than in traditional networks. However, modern mobile networks are able to transfer the extra payload in a reasonable time, and XML processing is not a major problem. Web Services can be accessed from modern mobile devices and can cut down development costs.
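The thesis client is written for Symbian; as a hedged, platform-neutral illustration of the same pattern (an HTTP request returning XML, which is then parsed), the sketch below uses Python's standard library against a placeholder endpoint and placeholder element names.

```python
# Platform-neutral sketch of a Web Services style call: fetch XML over HTTP and parse it.
# The endpoint URL and element names are placeholders, not the thesis's actual service.
import urllib.request
import xml.etree.ElementTree as ET

URL = "http://example.com/service/items"          # hypothetical endpoint

def fetch_items(url=URL):
    with urllib.request.urlopen(url, timeout=10) as resp:
        xml_bytes = resp.read()                    # payload size grows with XML verbosity
    root = ET.fromstring(xml_bytes)
    # Assume a response shaped like <items><item id="..">name</item>...</items>
    return [(item.get("id"), item.text) for item in root.findall("item")]

if __name__ == "__main__":
    try:
        for item_id, name in fetch_items():
            print(item_id, name)
    except Exception as exc:                       # the placeholder endpoint will not serve this XML
        print("request failed:", exc)
```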