970 results for Distributed task scheduling


Relevance:

20.00%

Publisher:

Abstract:

As a guide for librarians, library policy makers at the local level, community leaders, local and state policy makers, and library customers across the state, these recommendations create a vision of libraries as friendly, welcoming places where Iowans can access information in person or online, and obtain and use ideas and trusted information that will enhance their quality of life. This report specifies the steps to achieving this vision and creates an environment of opportunity to move steadily toward the new system.

Relevance:

20.00%

Publisher:

Abstract:

Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.

Location: World-wide.

Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.

Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information; and (2) the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.

Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.
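The stratified palaeogeographical model described above (dispersal rates scaled by area connectivity within each time slice) can be sketched as a small lookup table. The areas, slice boundaries and multipliers below are hypothetical placeholders, not the values used in the study:

```python
# Hypothetical areas, slice bounds (Ma) and connectivity multipliers; a real
# DEC analysis would take these from palaeogeographical reconstructions.
TIME_SLICES = [
    # (older bound, younger bound, {unordered area pair: multiplier})
    (110, 80, {frozenset({"Neotropics", "Africa"}): 1.0}),
    (80, 60, {frozenset({"Neotropics", "Africa"}): 0.5}),
    (60, 30, {frozenset({"Africa", "Eurasia"}): 1.0}),
    (30, 0, {frozenset({"Eurasia", "Australasia"}): 0.5}),
]

def dispersal_rate(area_a, area_b, age_ma, base_rate=1.0, default=0.1):
    """Scale a base dispersal rate by the connectivity multiplier of the
    time slice containing age_ma; poorly connected area pairs fall back
    to a small default rate rather than zero."""
    for older, younger, scalers in TIME_SLICES:
        if younger <= age_ma <= older:
            return base_rate * scalers.get(frozenset({area_a, area_b}), default)
    raise ValueError(f"no time slice covers {age_ma} Ma")
```

Keeping the fallback rate small but non-zero mirrors the caution in the conclusions: a model that forbids a dispersal route entirely can force inferred ancestral ranges outside the extant species ranges.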

Relevance:

20.00%

Publisher:

Abstract:

Osteoporosis is a serious worldwide epidemic. FRAX® is a web-based tool developed by the Sheffield WHO Collaborating Center team that integrates clinical risk factors and femoral neck BMD and calculates the 10-year fracture probability in order to help health care professionals identify patients who need treatment. However, only 31 countries have a FRAX® calculator. In the absence of a FRAX® model for a particular country, it has been suggested to use a surrogate country for which the epidemiology of osteoporosis most closely approximates the index country. More specific recommendations for clinicians in these countries are not available. In North America, concerns have also been raised regarding the assumptions used to construct the US ethnic-specific FRAX® calculators with respect to the correction factors applied to derive fracture probabilities in Blacks, Asians and Hispanics in comparison to Whites. In addition, questions were raised about calculating fracture risk in other ethnic groups, e.g., Native Americans and First Canadians. The International Society for Clinical Densitometry (ISCD), in conjunction with the International Osteoporosis Foundation (IOF), assembled an international panel of experts that ultimately developed joint Official Positions of the ISCD and IOF advising clinicians regarding FRAX® usage. As part of the process, the charge of the FRAX® International Task Force was to review and synthesize data regarding geographic and race/ethnic variability in hip fractures and non-hip osteoporotic fractures, and to make recommendations about the use of FRAX® in ethnic groups and countries without a FRAX® calculator. This synthesis was presented to the expert panel and constitutes the data on which the subsequent Official Positions are predicated. A summary of the International Task Force composition and charge is presented here.

Relevance:

20.00%

Publisher:

Abstract:

In this project I evaluated a series of platforms to determine which was best suited to integrating the tools that provide services of the TENCompetence project. To begin, I set out the context of the project and how it fits within the framework of the TENCompetence project, in which this final-year thesis was developed. I then review the tools available for accessing the different services the project provides, discuss the scenarios where the chosen technology will be applied and, finally, the different web platforms into which the tools will be integrated. Next, a chapter covers the requirements analysis for the application scenario of each pilot. In each scenario, particular tools are applied to a particular context, so there are specific needs to capture, and the requirements analysis records them. Once all the data were collected, I was able to select the container platform best suited to each pilot. With the requirements and the selected platform, I produced a design for each pilot. After refining the design, I carried out the implementation to cover the pilots' needs, and also examined which technology can be used to integrate the tools into the platform. With the implementation complete, I ran a series of tests to assess the results, and then began an iterative process to refine the design and improve the implementation.

Relevance:

20.00%

Publisher:

Abstract:

Over the last several years, lawmakers have been responding to several highly publicized child abduction, assault and murder cases. While such cases remain rare in Iowa, the public debates they have generated are having far-reaching effects. Policy makers are responsible for controlling the nature of such effects. Challenges they face stem from the need to avoid primarily politically-motivated responses and the desire to make informed decisions that recognize both the strengths and the limitations of the criminal justice system as a vehicle for promoting safe and healthy families and communities. Consensus was reached by the Task Force at its first meeting that one of its standing goals is to provide nonpartisan guidance to help avoid or fix problematic sex offense policies and practices. Setting this goal was a response to the concern over what can result from elected officials’ efforts to respond to the types of sex offender-related concerns that can easily become emotionally laden and politically charged due to the universally held abhorrence of sex crimes against children. The meetings of the Task Force and the various work groups it has formed have included some spirited and perhaps emotionally charged discussions, despite the above-stated ground rule. However, as is described in the report, the Task Force’s first set of recommendations and plans for further study were approved through consensus. It is hoped that in upcoming legislative deliberations, it will be remembered that the non-legislative members of the Task Force all agreed on the recommendations contained in this report. The topics discussed in this first report from the Task Force are limited to the study issues specifically named in H.F. 619, the Task Force’s enabling legislation. However, other topics of concern were discussed by the Task Force because of their immediacy or because of their possible relationships with one or more of the Task Force’s mandated study issues. 
For example, it has been reported by some probation/parole officers and others that the 2000 feet rule has had a negative influence on treatment participation and supervision compliance. While such concerns were noted, the Task Force did not take it upon itself to investigate them at this time and thus broaden the agenda it was given by the General Assembly last session. As a result, the recently reinstated 2000 feet rule, the new cohabitation/child endangerment law and other issues of interest to Task Force members but not within the scope of their charge are not discussed in the body of this report. An issue of perhaps the greatest interest to most Task Force members that was not a part of their charge was a belief in the benefit of viewing Iowa's efforts to protect children from sex crimes through as comprehensive a platform as possible. It has been suggested that much more can be done to prevent child-victim sex crimes than would be accomplished by concentrating only on what to do with offenders after a crime has occurred. To prevent child victimization, H.F. 619 policy provisions rely largely on incapacitation and the future deterrent effects of increased penalties, more restrictive supervision practices and greater public awareness of the risk presented by a segment of Iowa's known sex offenders. For some offenders, these policies will no doubt prevent future sex crimes against children, and the Task Force has begun long-term studies to look for the desired results and for ways to improve such results through better supervision tools and more effective offender treatment. Unfortunately, many of the effects of the new policies may primarily influence persons who have already committed sex offenses against minors and who have already been caught doing so. Task Force members discussed the need for a range of preventive efforts and a need to think about sex crimes against children from other than just a "reaction-to-the-offender" perspective.
While this topic is not addressed in the report that follows, it was suggested that some of the Task Force's discussions could be briefly shared through these opening comments. Along with incapacitation and deterrence, comprehensive approaches to the prevention of child-victim sex crimes would also involve making sure parents have the tools they need to detect signs of adults with sex behavior problems, to help teach their children about warning signs and to find the support they need for healthy parenting. School, faith-based and other community organizations might benefit from stronger supports and better tools they can use to more effectively promote positive youth development and the learning of respect for others, respect for boundaries and healthy relationships. All of us who have children, or who live in communities where there are children, need to understand the limitations of our justice system and the importance of our own ability to play a role in preventing sexual abuse and protecting children from sex offenders, who are often members of the child's own family. Over 1,000 incidents of child sexual abuse are confirmed or founded each year in Iowa, and most such acts take place in the child's home or the residence of the child's caretaker. Efforts to prevent child sexual abuse and to provide for early interventions with children and families at risk could be strategically examined and strengthened. The Sex Offender Treatment and Supervision Task Force was established to provide assistance to the General Assembly. It will respond to legislative direction for adjusting its future plans as laid out in this report. Its plans could be adjusted to broaden or narrow its scope or to assign different priority levels of effort to its current areas of study. Also, further Task Force consideration of the recommendations it has already submitted could be called for.
In the meantime, it is hoped that the information and recommendations submitted through this report prove helpful.

Relevance:

20.00%

Publisher:

Abstract:

Abstract

"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye)

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from social, natural or mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations when applied to transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities to the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones.

In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time-evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place on Boolean networks and proposed a more biologically plausible cascading scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of the previous GRN model, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
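The classical Kauffman random Boolean network taken as the starting point above can be sketched in a few lines of Python. This is a minimal illustration of the original fully synchronous NK model with hypothetical parameters, not the modified cascading-update scheme the work proposes:

```python
import random

def make_rbn(n, k, seed=0):
    """Build a classical Kauffman NK network: each node reads k distinct
    random inputs through its own random Boolean function (truth table)."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Fully synchronous update: every node fires simultaneously."""
    new = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for i in ins:                      # pack input bits into a table index
            idx = (idx << 1) | state[i]
        new.append(table[idx])
    return tuple(new)

def find_attractor(n=8, k=2, seed=0, max_steps=1000):
    """Iterate from a random state until a state repeats; return the length
    of the attractor cycle (one must exist: the state space is finite)."""
    inputs, tables = make_rbn(n, k, seed)
    rng = random.Random(seed + 1)
    state = tuple(rng.randint(0, 1) for _ in range(n))
    seen = {state: 0}
    for t in range(1, max_steps + 1):
        state = step(state, inputs, tables)
        if state in seen:
            return t - seen[state]
        seen[state] = t
    return None
```

Replacing `step`'s simultaneous firing with an asynchronous or cascading order, as the thesis proposes, typically changes the attractor structure substantially, which is precisely why the synchrony assumption matters biologically.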

Relevance:

20.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by simply incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
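The biased-randomization idea described above, picking from a sorted candidate list with geometrically decreasing probability instead of always taking the greedy best, can be sketched as follows. The makespan routine is the standard PFSP recursion, while the job data and the bias parameter `beta` are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def makespan(sequence, proc_times):
    """Makespan of a job permutation in a permutation flowshop:
    proc_times[j][m] is the processing time of job j on machine m."""
    completion = [0] * len(proc_times[0])
    for job in sequence:
        completion[0] += proc_times[job][0]
        for m in range(1, len(completion)):
            completion[m] = max(completion[m], completion[m - 1]) + proc_times[job][m]
    return completion[-1]

def biased_randomized_sequence(proc_times, beta=0.3, rng=None):
    """Build a sequence by sampling positions from a candidate list sorted
    by decreasing total processing time, using a geometric distribution:
    position 0 (the greedy choice) is the most likely but not certain."""
    rng = rng or random.Random()
    candidates = sorted(range(len(proc_times)),
                        key=lambda j: -sum(proc_times[j]))
    sequence = []
    while candidates:
        u = 1.0 - rng.random()             # u in (0, 1], avoids log(0)
        pos = int(math.log(u) / math.log(1.0 - beta)) % len(candidates)
        sequence.append(candidates.pop(pos))
    return sequence
```

In a full ILS, each such sequence would seed one run of the local-search, perturbation and acceptance loop; keeping the best of several biased-randomized starts, possibly generated in parallel, already filters out poorly designed starting points.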

Relevance:

20.00%

Publisher:

Abstract:

We present a polyhedral framework for establishing general structural properties of optimal solutions of stochastic scheduling problems in which multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) that takes as input the linear objective coefficients and (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
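As a concrete, if much simpler, instance of the class-ranking index policies the framework characterizes, the classical c-mu rule for multiclass queues assigns each class the index c_i * mu_i (holding cost times service rate) and serves classes in decreasing index order. The sketch below is this textbook rule, not the paper's adaptive-greedy algorithm:

```python
def c_mu_indices(holding_costs, service_rates):
    """Index of class i is c_i * mu_i: expected holding-cost reduction
    per unit of expected service time."""
    return [c * mu for c, mu in zip(holding_costs, service_rates)]

def priority_order(holding_costs, service_rates):
    """Class ids sorted by decreasing index (ties broken by class id);
    the head of the list is served first whenever it has waiting jobs."""
    indices = c_mu_indices(holding_costs, service_rates)
    return sorted(range(len(indices)), key=lambda i: (-indices[i], i))
```

For example, with holding costs (2, 1, 3) and service rates (1, 4, 1), the indices are (2, 4, 3), so class 1 is served first even though class 2 has the highest holding cost. Klimov's algorithm, and the adaptive-greedy extension above, generalize this kind of ranking to settings with feedback between classes.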

Relevance:

20.00%

Publisher:

Abstract:

Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.

Relevance:

20.00%

Publisher:

Abstract:

The project presented, iCognos, consists of a flexible platform that assists end-users in performing a series of mental tasks with a sensitized mobile telerobotic platform, with the aim of mitigating the problems associated with cognitive disorders through an ecological cognition approach.

Relevance:

20.00%

Publisher:

Abstract:

Multisensory experiences influence subsequent memory performance and brain responses. Studies have thus far concentrated on semantically congruent pairings, leaving unresolved the influence of stimulus pairing and memory sub-types. Here, we paired images with unique, meaningless sounds during a continuous recognition task to determine if purely episodic, single-trial multisensory experiences can incidentally impact subsequent visual object discrimination. Psychophysics and electrical neuroimaging analyses of visual evoked potentials (VEPs) compared responses to repeated images either paired or not with a meaningless sound during initial encounters. Recognition accuracy was significantly impaired for images initially presented as multisensory pairs and could not be explained in terms of differential attention or transfer of effects from encoding to retrieval. VEP modulations occurred at 100-130 ms and 270-310 ms and stemmed from topographic differences indicative of network configuration changes within the brain. Distributed source estimations localized the earlier effect to regions of the right posterior superior temporal gyrus (STG) and the later effect to regions of the middle temporal gyrus (MTG). Responses in these regions were stronger for images previously encountered as multisensory pairs. Only the later effect correlated with performance, such that greater MTG activity in response to repeated visual stimuli was linked with greater performance decrements. The present findings suggest that brain networks involved in this discrimination may critically depend on whether multisensory events facilitate or impair later visual memory performance. More generally, the data support models whereby effects of multisensory interactions persist to incidentally affect subsequent behavior as well as visual processing during its initial stages.

Relevance:

20.00%

Publisher:

Abstract:

Inhibitory control refers to the ability to suppress planned or ongoing cognitive or motor processes. Electrophysiological indices of inhibitory control failure have been found to manifest even before the presentation of the stimuli triggering the inhibition, suggesting that pre-stimulus brain-states modulate inhibition performance. However, previous electrophysiological investigations on the state-dependency of inhibitory control were based on averaged event-related potentials (ERPs), a method eliminating the variability in the ongoing brain activity not time-locked to the event of interest. These studies thus left unresolved whether spontaneous variations in the brain-state immediately preceding unpredictable inhibition-triggering stimuli also influence inhibitory control performance. To address this question, we applied single-trial EEG topographic analyses on the time interval immediately preceding NoGo stimuli in conditions where the responses to NoGo trials were correctly inhibited [correct rejection (CR)] vs. committed [false alarms (FAs)] during an auditory spatial Go/NoGo task. We found a specific configuration of the EEG voltage field manifesting more frequently before correctly inhibited responses to NoGo stimuli than before FAs. There was no evidence for an EEG topography occurring more frequently before FAs than before CR. The visualization of distributed electrical source estimations of the EEG topography preceding successful response inhibition suggested that it resulted from the activity of a right fronto-parietal brain network. Our results suggest that the fluctuations in the ongoing brain activity immediately preceding stimulus presentation contribute to the behavioral outcomes during an inhibitory control task. Our results further suggest that the state-dependency of sensory-cognitive processing might not only concern perceptual processes, but also high-order, top-down inhibitory control mechanisms.

Relevance:

20.00%

Publisher:

Abstract:

Report #02-04. Following concerns about the deaths of two Iowa inmates at the Anamosa State Penitentiary, the Ombudsman Office was asked to review the incidents and provide an assessment of each. The Governor also asked the Ombudsman Office to propose a set of recommendations for improving inmate and staff safety within the Anamosa State Penitentiary.

Relevance:

20.00%

Publisher:

Abstract:

The integrity of the cornea, the most anterior part of the eye, is indispensable for vision. Forty-five million individuals worldwide are bilaterally blind and another 135 million have severely impaired vision in both eyes because of loss of corneal transparency; treatments range from local medications to corneal transplants, and more recently to stem cell therapy. The corneal epithelium is a squamous epithelium that is constantly renewing, with a vertical turnover of 7 to 14 days in many mammals. Identification of slow-cycling cells (label-retaining cells) in the limbus of the mouse has led to the notion that the limbus is the niche for the stem cells responsible for the long-term renewal of the cornea; hence, the corneal epithelium is supposedly renewed by cells generated at and migrating from the limbus, in marked opposition to other squamous epithelia, in which each resident stem cell is in charge of a limited area of epithelium. Here we show that the corneal epithelium of the mouse can be serially transplanted, is self-maintained and contains oligopotent stem cells with the capacity to generate goblet cells if provided with a conjunctival environment. Furthermore, the entire ocular surface of the pig, including the cornea, contains oligopotent stem cells (holoclones) with the capacity to generate individual colonies of corneal and conjunctival cells. Therefore, the limbus is not the only niche for corneal stem cells, and corneal renewal is not different from that of other squamous epithelia. We propose a model that unifies our observations with the literature and explains why the limbal region is enriched in stem cells.