884 results for Time in Management and the Organisation
Abstract:
Time is embedded in any sensory experience: the movements of a dance, the rhythm of a piece of music, and the words of a speaker are all examples of temporally structured sensory events. In humans, whether and how visual cortices perform temporal processing remains unclear. Here we show that both primary visual cortex (V1) and extrastriate area V5/MT are causally involved in encoding and keeping time in memory and that this involvement is independent of low-level visual processing. Most importantly, we demonstrate that V1 and V5/MT are functionally linked and temporally synchronized during time encoding, whereas they are functionally independent and operate serially (V1 followed by V5/MT) while maintaining temporal information in working memory. These data challenge the traditional view of V1 and V5/MT as visuo-spatial feature detectors and highlight the functional contribution and temporal dynamics of these brain regions in the processing of time in the millisecond range. The present project resulted in the paper entitled 'How the visual brain encodes and keeps track of time' by Paolo Salvioni, Lysiann Kalmbach, Micah Murray and Domenica Bueti, now submitted for publication to the Journal of Neuroscience.
Abstract:
Background: About 80% of patients with Crohn's disease (CD) require bowel resection, and up to 65% will undergo a second resection within 10 years. This study reports clinical risk factors for resection surgery (RS) and repeat RS. Methods: Retrospective cohort study using data from patients included in the Swiss Inflammatory Bowel Disease Cohort. Cox regression analyses were performed to estimate rates of initial and repeated RS. Results: Of 1,138 CD cohort patients, 417 (36.6%) had already undergone RS at the time of inclusion. Kaplan-Meier curves showed that the probability of being free of RS was 65% after 10 years, 42% after 20 years, and 23% after 40 years. Perianal involvement (PA) did not modify this probability to a significant extent. The main adjusted risk factors for RS were smoking at diagnosis (hazard ratio (HR) = 1.33; p = 0.006), stricturing disease with vs. without PA (HR = 4.91 vs. 4.11; p < 0.001), and penetrating disease with vs. without PA (HR = 3.53 vs. 4.58; p < 0.001). The main risk factor for repeat RS was penetrating disease with vs. without PA (HR = 3.17 vs. 2.24; p < 0.05). Conclusion: The risk of RS was confirmed to be very high for CD in our cohort. Smoking status at diagnosis and, above all, penetrating and stricturing disease increase the risk of RS.
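A minimal sketch of the survival-analysis workflow summarised above (Kaplan-Meier estimates and Cox regression for hazard ratios), using the Python lifelines library; the file name and column names are hypothetical placeholders rather than the cohort's actual variables:

```python
# Sketch only: Kaplan-Meier and Cox PH modelling of time to resection surgery (RS).
# All file and column names below are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("cd_cohort.csv")  # hypothetical export of the cohort data

# Probability of remaining free of RS over time (Kaplan-Meier)
kmf = KaplanMeierFitter()
kmf.fit(df["years_to_rs"], event_observed=df["rs_event"])
print(kmf.predict([10, 20, 40]))   # survival probability at 10, 20 and 40 years

# Adjusted hazard ratios for RS (Cox proportional hazards)
cph = CoxPHFitter()
cph.fit(
    df[["years_to_rs", "rs_event", "smoker_at_dx",
        "stricturing", "penetrating", "perianal"]],
    duration_col="years_to_rs",
    event_col="rs_event",
)
cph.print_summary()                # exp(coef) column gives the hazard ratios
```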
Abstract:
Coffee and cocoa represent the main sources of income for small farmers in the Northern Amazon Region of Ecuador. The provinces of Orellana and Sucumbios, as border areas, have benefited from investments made by many public and private institutions. Many of the projects carried out in the area have been aimed at energising the production of coffee and cocoa, strengthening the producers’ associations and providing commercialisation infrastructure. Improving the quality of life of this population, threatened by poverty and high migration flows mainly from Colombia, is a significant challenge. This paper presents research highlighting the importance of associative commercialisation in raising income from coffee and cocoa. The research draws on primary information obtained during field work and on official information from the Ministry of Agriculture. The study presents an overview of current organisational structures, initiatives of associative commercialisation, stockpiling infrastructure and ownership regimes, as well as estimates of production and income for both ‘robusta’ coffee and national cocoa. The analysis of the main constraints presents different alternatives for the implementation of public land policies. These policies are aimed at mitigating the problems associated with the organisational structure of the producers, with processes of commercialisation and with environmental aspects, among others.
Abstract:
By the end of the 1970s, contaminated sites had emerged as one of the most complex and urgent environmental issues affecting industrialized countries. The authors show that small and prosperous Switzerland is no exception to the pervasive problem of site contamination, the legacy of past waste-management practices having left some 38,000 contaminated sites throughout the country. This book outlines the problem, offering evidence that open and polycentric environmental decision-making that includes civil society actors is valuable. The authors propose an understanding of the environmental management of contaminated sites as a political process in which institutions frame interactions between strategic actors pursuing sometimes conflicting interests. In the opening chapter, the authors describe the influence of politics and the power relationships between actors involved in decision-making in contaminated sites management, which they term a "wicked problem." Chapter Two offers a theoretical framework for understanding institutions and the environmental management of contaminated sites. The next five chapters present a detailed case study on environmental management and contaminated sites in Switzerland, focused on the Bonfol Chemical Landfill. The study and analysis cover the establishment of the landfill under the first generation of environmental regulations, its closure and early remediation efforts, and the gambling on the remediation objectives, methods and funding in the first decade of the 21st century. The concluding chapter discusses whether the strength of environmental regulations and the type of interactions between public, private, and civil society actors can explain the environmental choices in contaminated sites management. Drawing lessons from the research, the authors debate the value of institutional flexibility for dealing with environmental issues such as contaminated sites.
Abstract:
OBJECTIVE To examine the relationship between users' contact time in educational programs and self-care and knowledge variables in diabetes mellitus. METHOD A longitudinal study with a quantitative approach, involving in its initial phase 263 users linked to Basic Health Units in Belo Horizonte, Brazil, during 2012 and 2013. Data were collected on the users' total contact time in the educational program and on the knowledge and self-care acquired in diabetes mellitus. The data were analyzed using Student's t-test for comparison of means, at a 0.05 significance level. RESULTS The final sample included 151 users. The analysis showed that the improvement in self-care scores was statistically higher for educational interventions of eight hours or more (p < 0.05). Knowledge scores also showed a statistically significant improvement at the end of the educational program, although it was not possible to identify a contact time above which mean knowledge scores increased. CONCLUSION To improve the effectiveness of the promotion of skills related to knowledge and self-care in diabetes mellitus, contact time must be considered a relevant factor of the educational program.
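As an illustration of the comparison of means described in the METHOD, a minimal Python sketch using SciPy's Student t-test; the score changes and the two contact-time groups are synthetic placeholders, not the study's data:

```python
# Sketch only: compare mean self-care score gains between users with less than
# and at least eight hours of contact time, as in the abstract. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
delta_short = rng.normal(loc=1.0, scale=3.0, size=70)   # < 8 h of contact time
delta_long  = rng.normal(loc=3.0, scale=3.0, size=81)   # >= 8 h of contact time

t, p = stats.ttest_ind(delta_long, delta_short, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 -> higher mean gain in the long group
```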
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; and (e) a unified treatment of the time-discounted and time-average cases.
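The threshold-policy intuition can be illustrated with a deliberately simplified example: an M/M/1 queue with admission threshold K, far simpler than the Chen and Yao intensity-control model studied above. A short Python sketch, with arbitrary rates, showing that each threshold yields one throughput-WIP pair and that throughput gains diminish as WIP grows:

```python
# Sketch only: throughput-WIP pairs of admission-threshold policies in an
# M/M/1/K queue (arrival rate lam and service rate mu chosen arbitrarily).
lam, mu = 0.8, 1.0
rho = lam / mu

for K in range(0, 11):
    weights = [rho**n for n in range(K + 1)]
    total = sum(weights)
    p = [w / total for w in weights]              # stationary distribution of M/M/1/K
    throughput = lam * (1 - p[K])                 # admitted (= served) rate
    wip = sum(n * p[n] for n in range(K + 1))     # mean number in system
    print(f"K={K:2d}  WIP={wip:5.2f}  throughput={throughput:5.3f}")
```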
Abstract:
In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem. In the original MCP, market capture is obtained by lower traveling distances or lower traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. This problem is hard to solve using standard optimization techniques. Metaheuristics are shown to offer accurate results within acceptable computing times.
Abstract:
Adults of Cyclocephala distincta are flower visitors of Neotropical palms (Arecaceae) and are commonly found in the Atlantic Forest of Pernambuco, Brazil. Males and females were collected in the wild and subjected to captive rearing and breeding. The egg hatching rate, the life cycle, the longevity of immatures and adults, and oviposition parameters in captivity were analyzed. The average duration of the life cycle of C. distincta was 108.2 days (n = 45). The egg stage lasted on average 10.9 days, and the egg-hatching rate was 73.9%. The immature stage lasted on average 93.4 days. The larval stage exhibited negative phototaxis, and the size of the head capsules increased at a constant ratio of 1.6 between instars, following Dyar's rule. The average duration of the first instar was 24.8 days (n = 88), whereas the second and third instars lasted 17.2 (n = 76) and 40.4 (n = 74) days, respectively, with survival rates of 21.6%, 86.4% and 97.4%. The pre-pupal stage was recorded, and pupal chambers were built before pupation. The average number of eggs laid per female was 15.5, the total reproductive period lasted 3.3 days, and the total fertility was 81.2%. Adults that emerged in captivity exhibited an average longevity of 18.9 days. Adult C. distincta exhibited thanatosis behavior upon manipulation, a strategy observed for the first time in Cyclocephala.
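A tiny Python sketch of the Dyar's-rule check mentioned above: successive head-capsule widths should grow by a roughly constant ratio (about 1.6 here). The width values are made-up illustrative numbers, not measurements from the study:

```python
# Sketch only: check Dyar's rule with hypothetical head-capsule widths (mm).
widths_mm = [2.0, 3.2, 5.1]                        # instars I-III (illustrative)
ratios = [b / a for a, b in zip(widths_mm, widths_mm[1:])]
print([round(r, 2) for r in ratios])               # -> [1.6, 1.59], ~constant ratio
```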
Abstract:
INTRODUCTION: In November 2009, the "3rd Summit on Osteoporosis-Central and Eastern Europe (CEE)" was held in Budapest, Hungary. The conference aimed to tackle issues regarding osteoporosis management in CEE identified during the second CEE summit in 2008 and to agree on approaches that allow the most efficient and cost-effective diagnosis and therapy of osteoporosis in CEE countries in the future. DISCUSSION: The following topics were covered: the past year's experience with FRAX® implementation in local diagnostic algorithms; causes of secondary osteoporosis as a FRAX® risk factor; bone turnover markers to estimate bone loss and fracture risk or to monitor therapies; the role of quantitative ultrasound in osteoporosis management; compliance and economic aspects of osteoporosis; and osteoporosis and genetics. Consensus and recommendations developed on these topics are summarised in the present progress report. CONCLUSION: Lectures on up-to-date data of topical interest, the distinct regional provenances of the participants, a special focus on practical aspects, an intense mutual exchange of individual experiences, strong interest in cross-border cooperation, and the readiness to learn from each other contributed considerably to the establishment of these recommendations. The "4th Summit on Osteoporosis-CEE", held in Prague, Czech Republic, in December 2010, will reveal whether these recommendations prove to be of value when implemented in clinical routine or whether further improvements are still required.
Abstract:
It is essential for organizations to compress detailed sets of information into more comprehensive sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information-content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
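A small numerical illustration of the aggregation bias discussed in chapter 2, as a Python sketch: for right-skewed cost distributions, summing per-work-package modes understates the expected total, whereas means add up exactly. The lognormal parameters are arbitrary and serve only to show the effect:

```python
# Sketch only: mode-vs-mean aggregation bias for skewed (lognormal) costs.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n_packages = 0.0, 0.8, 10

mode_single = np.exp(mu - sigma**2)          # mode of one lognormal cost
mean_single = np.exp(mu + sigma**2 / 2)      # mean of one lognormal cost

totals = rng.lognormal(mu, sigma, size=(200_000, n_packages)).sum(axis=1)
print(f"sum of modes       : {n_packages * mode_single:6.2f}")   # underestimates
print(f"sum of means       : {n_packages * mean_single:6.2f}")   # linear, adds up
print(f"simulated E[total] : {totals.mean():6.2f}")              # matches sum of means
```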
Abstract:
Diffeomorphism-induced symmetry transformations and time evolution are distinct operations in generally covariant theories formulated in phase space. Time is not frozen. Diffeomorphism invariants are consequently not necessarily constants of the motion. Time-dependent invariants arise through the choice of an intrinsic time, or equivalently through the imposition of time-dependent gauge-fixing conditions. One example of such a time-dependent gauge fixing is the Komar-Bergmann use of Weyl curvature scalars in general relativity. An analogous gauge fixing is also imposed for the relativistic free particle, and the resulting complete set of time-dependent invariants for this exactly solvable model is displayed. In contrast with the free-particle case, we show that gauge invariants that are simultaneously constants of motion cannot exist in general relativity. They vary with intrinsic time.
Abstract:
The correct use of closed field chambers to determine N2O emissions requires defining the time of day that best represents the daily mean N2O flux. A short-term field experiment was carried out on a Mollisol soil on which annual crops were grown under no-till management in the Pampa Ondulada of Argentina. The N2O emission rates were measured every 3 h for three consecutive days. Fluxes ranged from 62.58 to 145.99 µg N-N2O m-2 h-1 (average of five field chambers) and were negatively related (R² = 0.34, p < 0.01) to topsoil temperature (14-20 °C). N2O emission rates measured between 9:00 and 12:00 showed a strong relationship with the daily mean N2O flux (R² = 0.87, p < 0.01), indicating that, in the study region, morning sampling is preferable for GHG flux measurements.
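A brief Python sketch of the sampling-time check described above: regress the 9:00-12:00 flux against the daily mean flux and inspect R². The flux values are synthetic placeholders, not the field measurements:

```python
# Sketch only: does the 9:00-12:00 flux track the daily mean N2O flux?
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
daily_mean = rng.uniform(62, 146, size=15)            # ug N-N2O m-2 h-1 (synthetic)
morning = daily_mean + rng.normal(0, 8, size=15)      # 9:00-12:00 readings (synthetic)

res = stats.linregress(morning, daily_mean)
print(f"R^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.3g}")  # high R^2 -> morning sampling OK
```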
Abstract:
The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created a number of significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and exploit the value of the data to which they have access, be this in the form of "Big Data" available from different external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also, frequently, demonstrate that they are in compliance with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, identify the most pertinent constraints and risks, and propose a framework for the management of data from discovery to external hosting that will allow the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.