972 results for real-effort task
Abstract:
Through a series of accounts centred on space, this piece essays a discourse on the connection between art and life, positing the relationship between scenes, the continuity between art and place, and the creative and narrative interweaving of artistic disciplines. The article aims to record the connection, in both directions of the vector, between places and projects, an elusive but real bond. Every writer pursues the marks of human activity on objects, the testing of a theory of inanimate objects that we register with our tools of analogy and anthropometric contrast. That is also the endeavour of art, and that was Pliny's definition of the very origin of painting: the attempt to retain the company of the beloved by tracing the shadow of their profile on a surface, turned into a spontaneous fresco museum. Works of art, like some projects, should be threads in time and space, links between art and life: between our table and the territory, between mind and matter, between man and world.
Abstract:
This Task Force report combines the most recent data from Eurostat with national sources to highlight the most significant labour mobility trends within the EU. Overall, the recent recession has not induced previously immobile workers to become more mobile, at least not in the larger member states. Mobility flows have moved away from crisis countries in response to the economic downturn but the desired increase in south-north mobility has not been observed so far. This leads the authors to conclude that successfully fostering mobility within EU15 countries requires tremendous effort. It is important that workers who are willing and able to move are not discouraged from doing so by unnecessary barriers to mobility. Improving the workings of the EURES system and its online job-matching platform; better cooperation of national employment agencies; streamlining the recognition of qualifications; and supporting language training within the EU are important contributions to labour mobility. The authors conclude that the EU is right to defend the free movement of workers. National governments should keep in mind that their ability to tap into an attractive foreign labour supply also hinges upon the perception of how mobile workers are treated in destination countries. If the political imperative requires regulations to be changed, such as the one guiding the coordination of social security, it is essential that no new mobility barriers are put in place.
Abstract:
In December 2014, ECMI and CEPS formed the European Capital Markets Expert Group (ECMEG) with the aim of providing a long-term contribution to the debate on the Capital Markets Union (CMU) project, proposed by the European Commission. After an intensive, year-long research effort and in-depth discussions with ECMEG members, this final report aims to rethink financial integration policies in the European Union and to devise an EU-wide plan to remove the barriers to greater capital markets integration. It offers a methodology to identify and prioritise cross-border barriers to capital markets integration and provides a set of policy recommendations to improve its key components: price discovery, execution and enforcement of capital markets transactions.
Abstract:
This monograph begins with a case study that provides a means for analyzing the complexity of organizational leadership in the contemporary security environment. As such, it presents a high-stakes problem set that required an operational adaptation by a cavalry squadron conducting combat operations in Baghdad. This problematic reality triggered the struggle to find a creative response to a very deadly problem, while cultural norms served as barriers that prevented the rejection of previously accepted solutions that had proven successful in the past, even though those solutions no longer fit the context of the present. The case study highlights leaders who were constrained by deeply held assumptions that inhibited their ability to adapt quickly to a changed environment. It then provides an example of a successful application of adaptive leadership and adaptive work performed by the organization after a period of reflection and a willingness to experiment and assume risk. The case study serves as a microcosm of the challenges facing the U.S. Army, and the corresponding leadership framework presented in this monograph can be used as a model for the Army as it attempts to make adaptation an institutional imperative. The paper presents a more holistic approach to leadership in which the leader transcends the role of mere authority figure and provides a safe and creative learning environment where the organization can tackle and solve adaptive challenges. The paper concludes by recommending that U.S. Army leaders apply Harvard Professor Dean Williams's theory to the challenges confronting the Army's leader development process, thereby fostering a culture of adaptive leaders.
Abstract:
As part of the Governor's effort to streamline State government through improvements in the efficiency and effectiveness of operations, Executive Order 2004-06 ("EO6") provided for the reorganization (consolidation) of the Department of Insurance, Office of Banks and Real Estate, Department of Professional Regulation and Department of Financial Institutions. Through EO6 the four predecessor agencies were abolished and a single new agency, the Department of Financial and Professional Regulation (hereafter referred to as "IDFPR"), was created. The purpose of the consolidation of the four regulatory agencies was to allow certain economies of scale to be realized, primarily within the executive management and administrative functions. Additionally, the consolidation would increase the effectiveness of operations through the integration of certain duplicative functions within the four predecessor agencies without the degradation of frontline functions. Beginning on or about July 1, 2004, the IDFPR began consolidation activities focusing primarily on the administrative functions of Executive Management, Fiscal and Accounting, General Counsel, Human Resources, Information Technology and Other Administrative Services. The underlying premise of the reorganization was that all improvements could be accomplished without degrading the frontline functions of the predecessor agencies. Accordingly, all powers, duties, rights, responsibilities and functions of the predecessor agencies migrated to IDFPR and the reorganization activities commenced July 1, 2004.
Abstract:
The real-time refinement calculus is a formal method for the systematic derivation of real-time programs from real-time specifications in a style similar to the non-real-time refinement calculi of Back and Morgan. In this paper we extend the real-time refinement calculus with procedures and provide refinement rules for refining real-time specifications to procedure calls. A real-time specification can include constraints on not only what outputs are produced, but also when they are produced. The derived programs can also include time constraints on when certain points in the program must be reached; these are expressed in the form of deadline commands. Such programs are machine independent. An important consequence of the approach taken is that not only are the specifications machine independent, but the whole refinement process is machine independent. To implement the machine-independent code on a target machine one has the separate task of showing that the compiled machine code will reach all its deadlines before they expire. For real-time programs, externally observable input and output variables are essential. These differ from local variables in that their values are observable over the duration of the execution of the program. Hence procedures require input and output parameter mechanisms that are references to the actual parameters, so that changes to external inputs are observable within the procedure and changes to output parameters are externally observable. In addition, we allow value and result parameters. These may be auxiliary parameters, which are used for reasoning about the correctness of real-time programs as well as in the expression of timing deadlines, but do not lead to any code being generated for them by a compiler. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Real-time software systems are rarely developed once and left to run. They are subject to changes of requirements as the applications they support expand, and they commonly outlive the platforms they were designed to run on. A successful real-time system is duplicated and adapted to a variety of applications; it becomes a product line. Current methods for real-time software development are commonly based on low-level programming languages and involve considerable duplication of effort when a similar system is to be developed or the hardware platform changes. To provide more dependable, flexible and maintainable real-time systems at a lower cost, what is needed is a platform-independent approach to real-time systems development. The development process is composed of two phases: a platform-independent phase, which defines the desired system behaviour and develops a platform-independent design and implementation, and a platform-dependent phase, which maps the implementation onto the target platform. The latter phase should be highly automated. For critical systems, assessing dependability is crucial. The partitioning into platform-dependent and platform-independent phases has to support verification of system properties through both phases.
Abstract:
Fast Classification (FC) networks were inspired by a biologically plausible mechanism for short-term memory where learning occurs instantaneously. Both the weights and the topology of an FC network are mapped directly from the training samples by using a prescriptive training scheme. Only two presentations of the training data are required to train an FC network. Compared with iterative learning algorithms such as back-propagation (which may require many hundreds of presentations of the training data), the training of FC networks is extremely fast and learning convergence is always guaranteed. Thus FC networks may be suitable for applications where real-time classification is needed. In this paper, FC networks are applied to the real-time extraction of gene expressions from Chlamydia microarray data. Both the classification performance and the learning time of the FC networks are compared with Multi-Layer Perceptron (MLP) networks and support vector machines (SVM) on the same classification task. The FC networks are shown to have extremely fast learning time and comparable classification accuracy.
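The defining idea above, weights copied directly from the training samples in a single prescriptive pass rather than fitted iteratively, can be illustrated with a minimal sketch. This is not the actual FC construction (which also maps the network topology from the data); it is a hypothetical nearest-prototype classifier that shares the key property of instantaneous, convergence-guaranteed training:

```python
import numpy as np

class PrototypeClassifier:
    """Single-pass 'prescriptive' learner: the weights (class prototypes)
    are computed directly from the training samples, so training needs no
    iteration and convergence is trivially guaranteed. A stand-in for
    FC-style instantaneous learning; the real FC scheme differs."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        # One prototype (mean vector) per class: a single pass over the data.
        self.prototypes_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Assign each sample to the class of its nearest prototype.
        d = np.linalg.norm(np.asarray(X, float)[:, None] - self.prototypes_[None], axis=2)
        return self.classes_[d.argmin(axis=1)]

clf = PrototypeClassifier().fit([[0, 0], [0, 1], [5, 5], [6, 5]], [0, 0, 1, 1])
print(clf.predict([[0.2, 0.3], [5.5, 5.0]]))  # -> [0 1]
```

The contrast with back-propagation is the point: there is no loss surface and no learning rate, so "training time" is a single pass over the data.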
Abstract:
The importance of availability of comparable real income aggregates and their components to applied economic research is highlighted by the popularity of the Penn World Tables. Any methodology designed to achieve such a task requires the combination of data from several sources. The first is purchasing power parities (PPP) data available from the International Comparisons Project roughly every five years since the 1970s. The second is national level data on a range of variables that explain the behaviour of the ratio of PPP to market exchange rates. The final source of data is the national accounts publications of different countries which include estimates of gross domestic product and various price deflators. In this paper we present a method to construct a consistent panel of comparable real incomes by specifying the problem in state-space form. We present our completed work as well as briefly indicate our work in progress.
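The abstract's key step is casting the construction of a comparable real-income panel in state-space form. The paper's actual specification is not given here, so the following is only a generic sketch of the state-space idea: a latent level observed through noise, estimated recursively with a Kalman filter for a local-level model:

```python
import numpy as np

def kalman_local_level(y, q=1e-2, r=1e-1):
    """Filter a noisy series with a local-level state-space model:
        state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
        observation: y_t = x_t + v_t,      v_t ~ N(0, r)
    Returns the filtered state estimates. Generic illustration only;
    the paper's specification for real incomes is richer."""
    x, p = y[0], 1.0           # initialise state and its variance
    out = []
    for obs in y:
        p += q                 # predict: state variance grows
        k = p / (p + r)        # Kalman gain
        x += k * (obs - x)     # update toward the observation
        p *= (1 - k)
        out.append(x)
    return np.array(out)

noisy = np.array([1.0, 1.2, 0.9, 1.1, 1.0, 1.3])
print(kalman_local_level(noisy))
```

Each filtered value is a convex combination of the previous estimate and the new observation, which is what lets the state-space form blend PPP benchmarks, national accounts and exchange-rate data into one consistent series.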
Abstract:
We propose a method for the timing analysis of concurrent real-time programs with hard deadlines. We divide the analysis into a machine-independent and a machine-dependent task. The latter takes into account the execution times of the program on a particular machine. Therefore, our goal is to make the machine-dependent phase of the analysis as simple as possible. We succeed in the sense that the machine-dependent phase remains the same as in the analysis of sequential programs. We shift the complexity introduced by concurrency completely to the machine-independent phase.
Abstract:
One factor that research suggests impedes positive contact between outgroup members is the experience of anxiety that can occur when anticipating negative consequences of such interactions. Research examining attitudes and behaviour towards same-sex attracted individuals indicates that this intergroup anxiety is particularly evident when the anticipated interaction involves members of the same gender. The current studies investigate the effect of timing of disclosure of a person’s same-sex attractions in an effort to identify a means of reducing this anxiety. Study 1 uses a hypothetical scenario to gain insight into participants’ stated preferences for early or delayed knowledge of a person’s sexual orientation. Results reveal an association between experiencing close contact with gay individuals of the same gender in real life (but not opposite gender), and a preference for early disclosure. Results from an experimental study concur with these findings. After a face-to-face interaction task with a confederate of the same gender, participants sit further from the confederate for the late disclosure condition when compared with the early disclosure and no disclosure control. Future studies investigating the interaction between timing of disclosure of same-sex attractions and the intimacy of disclosure (casual vs. intimate), are discussed.
Abstract:
The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.
Abstract:
We used magnetoencephalography (MEG) to examine the nature of oscillatory brain rhythms when passively viewing both illusory and real visual contours. Three stimuli were employed: a Kanizsa triangle; a Kanizsa triangle with a real triangular contour superimposed; and a control figure in which the corner elements used to form the Kanizsa triangle were rotated to negate the formation of illusory contours. The MEG data were analysed using synthetic aperture magnetometry (SAM) to enable the spatial localisation of task-related oscillatory power changes within specific frequency bands, and the time-course of activity within given locations-of-interest was determined by calculating time-frequency plots using a Morlet wavelet transform. In contrast to earlier studies, we did not find increases in gamma activity (> 30 Hz) to illusory shapes, but instead a decrease in 10–30 Hz activity approximately 200 ms after stimulus presentation. The reduction in oscillatory activity was primarily evident within extrastriate areas, including the lateral occipital complex (LOC). Importantly, this same pattern of results was evident for each stimulus type. Our results further highlight the importance of the LOC and a network of posterior brain regions in processing visual contours, be they illusory or real in nature. The similarity of the results for both real and illusory contours, however, leads us to conclude that the broadband (< 30 Hz) decrease in power we observed is more likely to reflect general changes in visual attention than neural computations specific to processing visual contours.
Abstract:
National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less demanding requirements do not call for exceptional computing power and can be met by a modern desktop system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction) augmented with above-ground information from satellite images to produce `nowcasts'. The emphasis in this thesis is on the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809 based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produces a time series of measurements at a specific location which represents the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images.
The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an `area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique for its implementation in automatically classifying a cloud feature over the `area of interest' for nowcasting using the multi-dimensional signals.
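The ARIMA modelling of the ground time series described above can be sketched in its simplest form. The thesis's actual models are not specified here, so this example fits only an AR(1) recurrence, the simplest member of the ARIMA family, and extrapolates a few steps ahead; the temperature values are hypothetical:

```python
import numpy as np

def ar1_nowcast(series, steps=3):
    """Fit an AR(1) model y_t = c + phi * y_{t-1} by least squares and
    iterate it forward to produce a short-horizon nowcast. A minimal
    sketch: full ARIMA adds differencing and moving-average terms."""
    y = np.asarray(series, float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])   # regressors: [1, y_{t-1}]
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]    # least-squares fit
    preds, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last      # iterate the fitted recurrence
        preds.append(last)
    return np.array(preds)

temps = [14.2, 14.5, 14.9, 15.1, 15.4, 15.6]  # hypothetical ground temperatures
print(ar1_nowcast(temps))
```

This is the sense in which a desktop system suffices for site-specific nowcasting: fitting and iterating such a recurrence on a short measurement history is computationally trivial compared with synoptic-scale numerical weather prediction.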