845 results for Task constraints
Abstract:
A prevalent claim is that we are in a knowledge economy. When we talk about the knowledge economy, we generally mean the concept of a “knowledge-based economy”, indicating the use of knowledge and technologies to produce economic benefits. Knowledge is thus both a tool and a raw material (people’s skills) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes. For example, authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only a few constraints on the set of coordination mechanisms. Hence what characterises a knowledge economy is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the problem of the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a “supervisory problem” (Hodgson, 1999) emerges, and traditional hierarchical control may prove increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because ‘the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better’ (Hodgson, 1999). We therefore expect that the organization of the economic activity of specialists should be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, and in particular from Peer-to-Peer (P2P) networks, to organization theories. We think that the P2P paradigm fits well with organizational problems arising in all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. There are three main characteristics that we think P2P networks have in common with firms involved in the knowledge economy: - Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the single peers; - Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them; - Self-organization: the process in a system leading to the emergence of global order within the system without the presence of another system dictating this order. These characteristics are also present in the kind of firm we try to address, and that is why we have shifted the techniques we adopted for studies in computer science (Marcozzi et al., 2005; Hales et al., 2007) to management science.
Abstract:
Optical coherence tomography (OCT) is a well-established imaging modality in ophthalmology and is used daily in the clinic. Automatic evaluation of such datasets requires an accurate segmentation of the retinal cell layers. However, due to the naturally low signal-to-noise ratio and the resulting poor image quality, this task remains challenging. We propose an automatic graph-based multi-surface segmentation algorithm that internally uses soft constraints to add prior information from a learned model. This improves the accuracy of the segmentation and increases the robustness to noise. Furthermore, we show that the graph size can be greatly reduced by applying a smart segmentation scheme. This allows the segmentation to be computed in seconds instead of minutes, without deteriorating the segmentation accuracy, making it ideal for a clinical setup. An extensive evaluation on 20 OCT datasets of healthy eyes was performed and showed a mean unsigned segmentation error of 3.05 ± 0.54 μm over all datasets when compared to the average observer, which is lower than the inter-observer variability. Similar performance was measured for the task of drusen segmentation, demonstrating the usefulness of soft constraints as a tool to deal with pathologies.
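To make the role of soft constraints concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation: a single surface is extracted as one depth per column of a 2D cost image by dynamic programming, with a quadratic soft penalty pulling the surface toward a learned prior depth. The cost image, the prior, and the weights are illustrative assumptions.

    import numpy as np

    def segment_surface(cost, prior, w_smooth=1.0, w_prior=0.1):
        """Extract one surface (one depth index per column) from a 2D cost image.

        cost[z, x] : data term, low where the boundary is likely
        prior[x]   : learned expected depth per column (soft constraint)
        w_smooth   : penalty on depth jumps between neighbouring columns
        w_prior    : weight of the soft prior penalty
        """
        Z, X = cost.shape
        z_idx = np.arange(Z)
        # Soft constraint: quadratic deviation from the learned prior depth.
        total = cost + w_prior * (z_idx[:, None] - prior[None, :]) ** 2
        dp = total[:, 0].copy()
        back = np.zeros((Z, X), dtype=int)
        for x in range(1, X):
            # Smoothness: quadratic penalty on jumps between adjacent columns.
            trans = dp[None, :] + w_smooth * (z_idx[:, None] - z_idx[None, :]) ** 2
            back[:, x] = trans.argmin(axis=1)
            dp = trans.min(axis=1) + total[:, x]
        # Backtrack the minimum-cost surface.
        surf = np.zeros(X, dtype=int)
        surf[-1] = dp.argmin()
        for x in range(X - 1, 0, -1):
            surf[x - 1] = back[surf[x], x]
        return surf

    # Toy usage: a noisy bright boundary around depth 40, prior at depth 42.
    rng = np.random.default_rng(0)
    img = rng.normal(0, 1, (100, 50))
    img[40, :] -= 5.0                  # low cost along the true boundary
    surface = segment_surface(img, prior=np.full(50, 42.0))
    print(surface[:10])

A hard constraint would forbid deviating configurations outright; the soft prior merely penalizes deviations, which is what lends robustness in noisy or pathological scans.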
Abstract:
We derive multiscale statistics for deconvolution in order to detect qualitative features of the unknown density. An important example covered within this framework is testing for local monotonicity on all scales simultaneously. We investigate the moderately ill-posed setting, where the Fourier transform of the error density in the deconvolution model is of polynomial decay. For multiscale testing, we consider a calibration motivated by the modulus of continuity of Brownian motion. We investigate the performance of our results from both a theoretical and a simulation-based point of view. A major consequence of our work is that the detection of qualitative features of a density in a deconvolution problem is a feasible task, although the minimax rates for pointwise estimation are very slow.
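The abstract does not reproduce the statistic itself; purely as a hedged sketch of the typical shape of such multiscale statistics (in the style of Dümbgen and Spokoiny), one simultaneously controls all locations t and scales h via a calibrated supremum of the form

\[
T_n \;=\; \sup_{(t,h)} \left( \frac{|\hat{\Psi}_{t,h}|}{\sigma_{t,h}} \;-\; \sqrt{2 \log \tfrac{1}{h}} \right),
\]

where \(\hat{\Psi}_{t,h}\) is a local test statistic for, e.g., monotonicity on \([t-h, t+h]\), \(\sigma_{t,h}\) its standard deviation, and the additive term \(\sqrt{2\log(1/h)}\) is the scale calibration motivated by the modulus of continuity of Brownian motion.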
Abstract:
With more experience in the labor market, some job characteristics increase and some decrease. For example, among young employees who have just entered the labor market, job control may initially be low but increase with more routine and experience. Job control is a job resource that is valued in itself and is positively associated with job satisfaction; but job control also helps in dealing with stressors at work. There is little research on correlated changes, but the existing evidence suggests a joint development over time. However, even less is known about the relevance of such changes for employees. Research usually tends to use mean levels to predict mean levels in outcomes, but development in job control and stressors may be as relevant for job satisfaction as having a certain level of those job characteristics. Job satisfaction is typically regarded as a positive attitude towards one’s work. What has received less attention is that some employees may lower their expectations if their job situation does not reflect their needs, resulting in a resigned attitude towards one’s job. The present study investigates the development of job control and task-related stressors over ten years and tests the predictive value of changes in job control and task-related stressors for a resigned attitude towards one’s job. We used data from a Swiss panel study (N = 356) spanning ten years. Job control, task-related stressors (an index consisting of time pressure, concentration demands, performance constraints, interruptions, and uncertainty about tasks), and resigned attitude towards one’s job were assessed in 1998, 1999, 2001, and 2008. Latent growth modeling revealed that the growth rates of job control and task-related stressors were not correlated with one another. We predicted resigned attitude towards one’s job in 2008 (a) by initial levels and (b) by changes in job control and stressors, controlling for resigned attitude in 1998. There was some prediction by initial levels (job control: β = -.15, p < .05; task-related stressors: β = .12, p = .06). However, as expected, changes in control and stressors predicted resigned attitude much better, with β = -.37, p < .001, for changes in job control, and β = .31, p < .001, for changes in task-related stressors. Our data confirm the importance of low levels of task-related stressors and high levels of job control for job attitudes. However, the development of these job characteristics seems even more important than initial levels.
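As a loose illustration of the analysis logic only: the study used latent growth modeling, whereas the simplified change-score regression below is a stand-in, with hypothetical column names and synthetic data used solely to make the script runnable.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in data; the real study fit latent growth curves.
    rng = np.random.default_rng(1)
    n = 356
    df = pd.DataFrame({
        "control_1998": rng.normal(0, 1, n),
        "stress_1998": rng.normal(0, 1, n),
        "resign_1998": rng.normal(0, 1, n),
    })
    df["control_change"] = rng.normal(0, 1, n)   # change 1998 -> 2008
    df["stress_change"] = rng.normal(0, 1, n)
    df["resign_2008"] = (0.3 * df.resign_1998 - 0.37 * df.control_change
                         + 0.31 * df.stress_change + rng.normal(0, 1, n))

    # Predict the 2008 outcome from initial levels and changes,
    # controlling for the 1998 outcome.
    model = smf.ols(
        "resign_2008 ~ resign_1998 + control_1998 + stress_1998"
        " + control_change + stress_change", data=df).fit()
    print(model.params)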
Abstract:
Several theories assume that successful team coordination is partly based on knowledge that helps to anticipate the individual contributions necessary in a situational task. It has been argued that a more ecological perspective needs to be considered in contexts that evolve dynamically and unpredictably. In football, defensive plays are usually coordinated according to strategic concepts spanning all members and large areas of the playfield. On the other hand, fewer people are involved in offensive plays, as these are less projectable and strongly constrained by ecological characteristics. The aim of this study is to test the effects of ecological constraints and player knowledge on decision making in offensive game scenarios. It is hypothesized that both knowledge about team members and situational constraints will influence decisional processes, with the effects of situational constraints expected to be of higher magnitude. Two teams playing in the fourth league of the Swiss Football Federation participated in the study. Forty customized game scenarios were developed based on the coaches’ information about player positions and game strategies. Each player was shown in ball possession four times. Participants were asked to take the perspective of the player on the ball and to choose a passing destination and a recipient. Participants then rated domain-specific strengths (e.g., technical skills, game intelligence) of each of their teammates. Multilevel models for categorical dependent variables (team members) will be specified. Player knowledge (rated skills) and ecological constraints (operationalized as each player’s proximity and availability for ball reception) are included as predictor variables. Data are currently being collected. Results will yield effects of parameters that are stable across situations as well as of variable parameters that are bound to the situational context. These will provide insight into the degree to which ecological constraints and more enduring team knowledge are involved in decisional processes aimed at coordinating interpersonal action.
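A minimal, hypothetical sketch of the choice model underlying such an analysis: a softmax over candidate pass recipients, combining a rated-skill feature with a proximity feature. The weights and feature values are illustrative assumptions, not the study's fitted multilevel model.

    import numpy as np

    def pass_choice_probs(skill, proximity, b_skill, b_prox):
        """Softmax choice model over candidate pass recipients.

        skill, proximity : arrays with one entry per candidate teammate
        Returns the probability of each teammate being chosen.
        """
        utility = b_skill * skill + b_prox * proximity
        u = utility - utility.max()      # subtract max for numerical stability
        p = np.exp(u)
        return p / p.sum()

    # Toy scenario: three candidate recipients.
    skill = np.array([0.8, 0.5, 0.9])       # rated domain-specific strengths
    proximity = np.array([0.2, 0.9, 0.4])   # availability for ball reception
    print(pass_choice_probs(skill, proximity, b_skill=1.0, b_prox=2.0))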
Abstract:
The Task Force was charged with exploring the causes and consequences of bullying in schools in this State, identifying promising practices that reduce incidences of bullying, highlighting training and technical assistance opportunities for schools to effectively address bullying, evaluating the effectiveness of schools’ current anti-bullying policies and other bullying prevention programs, and other related issues. Under tight time constraints, the Task Force has met that charge. The Illinois State Board of Education will post this report on its webpage devoted to the Task Force (http://www.isbe.net/SBPTF/default.htm). Moreover, in the near future, the Task Force will produce an Executive Summary of this report, which the ISBE will also include on the Task Force webpage.
Abstract:
Based on the observation that bimanual finger tapping movements tend toward mirror symmetry with respect to the body midline, despite the synchronous activation of non-homologous muscles, F. Mechsner, D. Kerzel, G. Knoblich, and W. Prinz (2001) [Perceptual basis of bimanual coordination. Nature, 414, 69-73] suggested that the basis of rhythmic coordination is purely spatial/perceptual in nature, and independent of the neuro-anatomical constraints of the motor system. To investigate this issue further, we employed a four-finger tapping task similar to that used by F. Mechsner and G. Knoblich (2004) [Do muscles matter in bimanual coordination? Journal of Experimental Psychology: Human Perception and Performance, 30, 490-503], in which six male participants were required to alternately tap combinations of adjacent pairs of index (I), middle (M) and ring (R) fingers of each hand in time with an auditory metronome. The metronome pace increased continuously from 1 Hz to 3 Hz over the course of a 30-s trial. Each participant performed three blocks of trials in which the finger combination for each hand (IM or MR) and the mode of coordination (mirror or parallel) were presented in random order. Within each block, the right hand was placed in one of three orientations: prone, neutral and supine. The order of blocks was counterbalanced across the six participants. The left hand maintained a prone position throughout the experiment. On the basis of discrete relative phase analyses between synchronised taps, the time at which the initial mode of coordination was lost was determined for each trial. When the right hand was prone, transitions occurred only from parallel symmetry to mirror symmetry, regardless of finger combination. In contrast, when the right hand was supine, transitions occurred only from mirror symmetry to parallel symmetry, with no transitions observed in the opposite direction. In the right-hand neutral condition, mirror and parallel symmetry are insufficient to describe the modes of coordination, since the hands are oriented orthogonally. When defined anatomically, however, the results in each of the three right-hand orientations are consistent. That is, synchronisation of finger tapping is determined by a hierarchy of control of individual fingers based on their intrinsic neuro-mechanical properties rather than on the basis of their spatial orientation.
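One standard way to compute the discrete relative phase between synchronized taps is sketched below; this is a minimal Python illustration, not necessarily the authors' exact procedure. Each right-hand tap is expressed as a fraction of the ongoing left-hand inter-tap interval, scaled to degrees.

    import numpy as np

    def discrete_relative_phase(taps_left, taps_right):
        """Discrete relative phase of right-hand taps within the left-hand cycle.

        taps_left, taps_right : sorted arrays of tap times in seconds
        Returns one phase value (degrees) per right-hand tap.
        """
        phases = []
        for t0, t1 in zip(taps_left[:-1], taps_left[1:]):
            inside = taps_right[(taps_right >= t0) & (taps_right < t1)]
            for tr in inside:
                phases.append(360.0 * (tr - t0) / (t1 - t0))
        return np.array(phases)

    # Toy usage: right hand drifts from synchrony (0 deg) toward anti-phase.
    left = np.arange(0.0, 10.0, 0.5)
    right = left + np.linspace(0.0, 0.25, left.size)
    print(discrete_relative_phase(left, right)[:5])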
Abstract:
This thesis is about the development of public debt and deficit in the eurozone, which has been at the center of attention for much of the new millennium. The debt-to-GDP and deficit-to-GDP ratios have changed significantly during the period of European monetary integration, with sharp increases in the levels since the beginning of the financial crisis. We examine the levels both before and after the establishment of the European Central Bank. The subject is complemented by a study of the restrictions on fiscal policy in the eurozone. The thesis begins with a review of the most central agreements of the Economic and Monetary Union, namely the Maastricht Treaty, the Stability and Growth Pact and the Fiscal Compact. We study the instructions and requirements provided by these contracts, with the emphasis on the debt and deficit values. Furthermore, we review two theories that aim to tell us whether the fiscal restrictions are useful or not. The second, empirical part consists of a review of the debt and deficit levels in practice. We take a close look at the values for each of the currency union members. The third and last part summarizes the findings and analyzes the reasons behind the changes. The result of the thesis is that, even though the levels of public debt and deficit have worsened since the beginning of the financial crisis, tight rules on fiscal policy might not be the best possible solution. The private sector has played a crucial part in the increase of the debt levels, and tight rules have their impact on the long-awaited economic growth in the eurozone. It is obvious, though, that some form of fiscal guidelines with a scientific grounding is needed in order to avoid excessive and harmful debt and deficit levels. The main task is to make these guidelines a more essential part of fiscal policy in each of the member countries.
Abstract:
Motion planning, or trajectory planning, commonly refers to a process of converting high-level task specifications into low-level control commands that can be executed on the system of interest. For different applications, the system will be different: it can be an autonomous vehicle, an Unmanned Aerial Vehicle (UAV), a humanoid robot, or an industrial robotic arm. As human-machine interaction is essential in many of these systems, safety is fundamental and crucial. Many of the applications also involve performing a task in an optimal manner within a given time constraint. Therefore, in this thesis, we focus on two aspects of the motion planning problem. One is the verification and synthesis of safe controls for autonomous ground and air vehicles in collision avoidance scenarios. The other focuses on high-level planning for autonomous vehicles under timed temporal constraints. In the first aspect of our work, we first propose a verification method to prove the safety and robustness of a path planner and the path-following controls based on reachable sets. We demonstrate the method on quadrotor and automobile applications. Secondly, we propose a reachable-set-based collision avoidance algorithm for UAVs. Instead of the traditional approaches of collision avoidance between trajectories, we propose a collision avoidance scheme based on reachable sets and tubes. We then formulate the problem as a convex optimization problem that seeks a control set design allowing the aircraft to avoid collision. We apply our approach to collision avoidance scenarios of quadrotors and fixed-wing aircraft. In the second aspect of our work, we address high-level planning problems with timed temporal logic constraints. Firstly, we present an optimization-based method for path planning of a mobile robot subject to timed temporal constraints in a dynamic environment. Temporal logic (TL) can address very complex task specifications such as safety, coverage, motion sequencing, etc. We use metric temporal logic (MTL) to encode task specifications with timing constraints. We then translate the MTL formulae into mixed-integer linear constraints and solve the associated optimization problem using a mixed-integer linear program solver. We have applied our approach to several case studies in complex dynamical environments subject to timed temporal specifications. Secondly, we also present a timed-automaton-based method for planning under given timed temporal logic specifications. We use metric interval temporal logic (MITL), a member of the MTL family, to represent the task specification, and provide a constructive way to generate a timed automaton and methods to look for accepting runs on the automaton to find an optimal motion (or path) sequence for the robot to complete the task.
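As a minimal sketch of the MTL-to-MILP translation step: the snippet below gives an illustrative big-M encoding of a single "eventually" operator, F_[3,5](x in [8,10]), for a 1D robot, using the PuLP solver. The horizon, dynamics, and goal region are assumptions for illustration, not the thesis's actual formulation.

    import pulp

    # MILP encoding of the MTL formula F_[3,5] (x in [8, 10]):
    # "within time steps 3..5 the robot must visit the goal region".
    T, vmax, M = 6, 3.0, 100.0
    prob = pulp.LpProblem("mtl_milp", pulp.LpMinimize)
    x = [pulp.LpVariable(f"x_{t}", -20, 20) for t in range(T + 1)]
    u = [pulp.LpVariable(f"u_{t}", 0) for t in range(T)]       # |step| bounds
    z = [pulp.LpVariable(f"z_{t}", cat="Binary") for t in range(T + 1)]

    prob += pulp.lpSum(u)                 # objective: minimise total movement
    prob += x[0] == 0                     # initial position
    for t in range(T):
        prob += x[t + 1] - x[t] <= vmax   # simple dynamics: bounded speed
        prob += x[t] - x[t + 1] <= vmax
        prob += u[t] >= x[t + 1] - x[t]   # u_t >= |x_{t+1} - x_t|
        prob += u[t] >= x[t] - x[t + 1]
    for t in range(T + 1):
        # Big-M: z_t = 1 forces x_t into the goal region [8, 10].
        prob += x[t] >= 8 - M * (1 - z[t])
        prob += x[t] <= 10 + M * (1 - z[t])
    prob += pulp.lpSum(z[3:6]) >= 1       # "eventually" within steps 3..5
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print([pulp.value(v) for v in x])

Each binary z_t asserts that the goal predicate holds at step t; the temporal disjunction over the window [3,5] becomes the linear constraint that at least one of those binaries equals 1.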
Abstract:
Reconfigurable hardware can be used to build multitasking systems that dynamically adapt themselves to the requirements of the running applications. This is especially useful in embedded systems, since the available resources are very limited and the reconfigurable hardware can be reused for different applications. In these systems, computations are frequently represented as task graphs that are executed taking into account their internal dependencies and the task schedule. The management of the task-graph execution is critical for the system performance. In this regard, we have developed two different versions, a software module and a hardware architecture, of a generic task-graph execution manager for reconfigurable multitasking systems. The second version reduces the run-time management overheads by almost two orders of magnitude; hence it is especially suitable for systems with stringent timing constraints. Both versions include specific support to optimize the reconfiguration process.
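The abstract does not detail the manager's internals; the following is a minimal software sketch of dependency-driven task-graph execution (Kahn's topological-order algorithm), assuming a task becomes ready once all its predecessors have finished. The data structures and names are illustrative, not the paper's design.

    from collections import deque

    def execute_task_graph(tasks, deps):
        """Run a task graph respecting internal dependencies.

        tasks : dict mapping task name -> callable
        deps  : dict mapping task name -> list of prerequisite task names
        """
        indegree = {t: len(deps.get(t, [])) for t in tasks}
        children = {t: [] for t in tasks}
        for t, prereqs in deps.items():
            for p in prereqs:
                children[p].append(t)
        ready = deque(t for t, d in indegree.items() if d == 0)
        order = []
        while ready:
            t = ready.popleft()
            tasks[t]()                    # dispatch: here, just run the callable
            order.append(t)
            for c in children[t]:
                indegree[c] -= 1
                if indegree[c] == 0:      # all dependencies satisfied
                    ready.append(c)
        if len(order) != len(tasks):
            raise ValueError("cycle in task graph")
        return order

    # Toy graph: B and C depend on A; D depends on B and C.
    tasks = {n: (lambda n=n: print("run", n)) for n in "ABCD"}
    print(execute_task_graph(tasks, {"B": ["A"], "C": ["A"], "D": ["B", "C"]}))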
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
We present a re-analysis of the Geneva-Copenhagen survey, which benefits from the infrared flux method to improve the accuracy of the derived stellar effective temperatures and uses the latter to build a consistent and improved metallicity scale. Metallicities are calibrated on high-resolution spectroscopy and checked against four open clusters and a moving group, showing excellent consistency. The new temperature and metallicity scales provide a better match to theoretical isochrones, which are used for a Bayesian analysis of stellar ages. With respect to previous analyses, our stars are on average 100 K hotter and 0.1 dex more metal rich, which shifts the peak of the metallicity distribution function to around the solar value. From Strömgren photometry we are able to derive for the first time a proxy for [alpha/Fe] abundances, which enables us to perform a tentative dissection of the chemical thin and thick disc. We find evidence for the latter being composed of an old, mildly but systematically alpha-enhanced population that extends to super-solar metallicities, in agreement with spectroscopic studies. Our revision offers the largest existing kinematically unbiased sample of the solar neighbourhood that contains full information on kinematics, metallicities, and ages, and thus provides better constraints on the physical processes relevant to the build-up of the Milky Way disc, enabling a better understanding of the Sun in a Galactic context.
Abstract:
We discuss the dynamics of the Universe within the framework of the massive graviton cold dark matter scenario (MGCDM), in which gravitons are geometrically treated as massive particles. In this modified gravity theory, the main effect of the gravitons is to alter the density evolution of the cold dark matter component in such a way that the Universe evolves to an accelerating expanding regime, as presently observed. Tight constraints on the main cosmological parameters of the MGCDM model are derived by performing a joint likelihood analysis involving the recent type Ia supernovae data, the cosmic microwave background shift parameter, and the baryonic acoustic oscillations as traced by the Sloan Digital Sky Survey red luminous galaxies. The linear evolution of small density fluctuations is also analyzed in detail. It is found that the growth factor of the MGCDM model differs only slightly (∼1-4%) from the one provided by the conventional flat ΛCDM cosmology. The growth rates of clustering predicted by the MGCDM and ΛCDM models are confronted with the observations, and the corresponding best-fit values of the growth index (γ) are also determined. By using the expectations of realistic future X-ray and Sunyaev-Zeldovich cluster surveys, we derive the dark matter halo mass function and the corresponding redshift distribution of cluster-size halos for the MGCDM model. Finally, we also show that the Hubble flow differences between the MGCDM and the ΛCDM models provide a halo redshift distribution departing significantly from those predicted by other dark energy models. These results suggest that the MGCDM model can observationally be distinguished from ΛCDM and also from a large number of dark energy models recently proposed in the literature.
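For reference, the growth index γ quoted above enters through the standard parametrization of the linear growth rate (a textbook definition, not specific to the MGCDM analysis):

\[
f(a) \;\equiv\; \frac{d\ln\delta_m}{d\ln a} \;\simeq\; \Omega_m(a)^{\gamma},
\]

where \(\delta_m\) is the linear matter density contrast, \(\Omega_m(a)\) the matter density parameter at scale factor a, and \(\gamma \approx 6/11 \approx 0.55\) for flat ΛCDM.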
Abstract:
We discuss the properties of homogeneous and isotropic flat cosmologies in which the present accelerating stage is powered only by the gravitationally induced creation of cold dark matter (CCDM) particles (Ω_m = 1). For some matter creation rates proposed in the literature, we show that the main cosmological functions, such as the scale factor of the universe, the Hubble expansion rate, the growth factor, and the cluster formation rate, are analytically defined. The best CCDM scenario has only one free parameter, and our joint analysis involving baryonic acoustic oscillations + cosmic microwave background (CMB) + SNe Ia data yields Ω̃_m = 0.28 ± 0.01 (1σ), where Ω̃_m is the observed matter density parameter. In particular, this implies that the model has no dark energy but the part of the matter that is effectively clustering is in good agreement with the latest determinations from the large-scale structure. The growth of perturbations and the formation of galaxy clusters in such scenarios are also investigated. Despite the fact that both scenarios may share the same Hubble expansion, we find that matter creation cosmologies predict stronger small-scale dynamics, which implies a faster growth rate of perturbations with respect to the usual ΛCDM cosmology. Such results point to the possibility of a crucial observational test confronting CCDM with ΛCDM scenarios through a more detailed analysis involving CMB, weak lensing, as well as the large-scale structure.