854 results for Management models and fashions
Abstract:
Feedlot consulting nutritionists were invited to participate in a survey of feedlot nutritional and management practices in Brazil. Thirty-one nutritionists completed the survey on a Web site designed for collection of survey data. The survey consisted of 94 questions that included general information (n = 10); commodity information (n = 12); and questions about the use of coproducts (n = 5), roughage source and levels (n = 5), finishing diet adaptation methods (n = 7), supplements and micronutrients (n = 8), feed mixers (n = 6), feeding management (n = 3), cattle management and type of cattle fed (n = 16), formulation practices (n = 17), information resources used for nutritional recommendations (n = 2), and 2 additional questions. One final question addressed the primary challenges associated with applying nutritional recommendations in practice. The number of animals serviced yearly by each nutritionist averaged 121,682 (minimum = 2,000; maximum = 1,500,000; mode = 120,000; total = 3,163,750). Twenty-two respondents (71%) worked with feedlots that fed fewer than 5,000 animals/yr. Labor, along with the availability and precision of equipment, appeared to be the main challenge for the nutritionists surveyed. Most of the nutritionists surveyed used TDN as the primary energy unit for formulation. More than 50% of the clients serviced by the 31 nutritionists did not manage feed bunks to control the quantity of feed offered per pen, and 36.6% fed cattle more than 4 times daily. The NRC (1996) and the Journal of Animal Science were the sources of information most used by these nutritionists. Overall, the general practices and nutritional recommendations provided by the 31 nutritionists surveyed were fairly consistent. These data should aid in the development of new research, future National Research Council models, and recommendations for Brazilian feeding systems in which Bos indicus cattle predominate.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Logistics involves planning, managing, and organizing the flows of goods from the point of origin to the point of destination in order to meet given requirements. Logistics and transportation aspects are very important and represent a relevant cost for producing and shipping companies, but also for public administrations and private citizens. The optimization of resources and the improvement of the organization of operations are crucial for all branches of logistics, from operations management to transportation. As we will show in this work, optimization techniques, models, and algorithms are important methods for solving the ever new and more complex problems arising in different segments of logistics. Many operations management and transportation problems belong to the class of optimization problems known as Vehicle Routing Problems (VRPs). In this work, we consider several real-world deterministic and stochastic problems that fall within the wide class of VRPs, and we solve them by means of exact and heuristic methods. We treat three classes of real-world routing and logistics problems. First, we deal with one of the most important tactical problems arising in the management of bike sharing systems, the Bike sharing Rebalancing Problem (BRP). Second, we propose models and algorithms for real-world earthwork optimization problems. Third, we describe the 3D printing (3DP) process and highlight several optimization issues in 3DP; among these, we define the problem related to tool path definition in the 3DP process, the 3D Routing Problem (3DRP), which is a generalization of the arc routing problem. We present an ILP model and several heuristic algorithms to solve the 3DRP.
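To make the heuristic side of this class of problems concrete, here is a minimal, self-contained Python sketch of a greedy nearest-neighbour construction heuristic for a toy capacitated VRP. The instance, the capacity value, and the function name are hypothetical illustrations and are not taken from the work summarized above.

```python
import math

# Hypothetical toy instance: depot (0) plus five customers with demands.
coords = {0: (0, 0), 1: (2, 3), 2: (5, 1), 3: (6, 4), 4: (1, 6), 5: (4, 7)}
demand = {1: 4, 2: 3, 3: 5, 4: 2, 5: 6}
capacity = 10  # assumed vehicle capacity

def dist(a, b):
    (x1, y1), (x2, y2) = coords[a], coords[b]
    return math.hypot(x1 - x2, y1 - y2)

def nearest_neighbour_routes():
    """Greedy construction: start at the depot and repeatedly visit the
    closest unserved customer that still fits in the vehicle."""
    unserved, routes = set(demand), []
    while unserved:
        load, node, route = 0, 0, [0]
        while True:
            feasible = [c for c in unserved if load + demand[c] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(node, c))
            route.append(nxt)
            load += demand[nxt]
            unserved.remove(nxt)
            node = nxt
        route.append(0)  # return to the depot
        routes.append(route)
    return routes

print(nearest_neighbour_routes())
```

Real variants such as the BRP or the 3DRP add side constraints (rebalancing quantities, arc-coverage requirements), but a construct-then-improve pattern like the one sketched here is a common starting point for heuristic methods.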
Abstract:
BACKGROUND Prophylactic measures are key components of dairy herd mastitis control programs, but some are only relevant in specific housing systems. To assess the association between management practices and mastitis incidence, data collected in 2011 by a survey among 979 randomly selected Swiss dairy farms, and information from the regular test day recordings from 680 of these farms, were analyzed. RESULTS The median incidence of farmer-reported clinical mastitis (ICM) was 11.6 (mean 14.7) cases per 100 cows per year. The median annual proportion of milk samples with a composite somatic cell count (PSCC) above 200,000 cells/ml was 16.1 (mean 17.3) %. A multivariable negative binomial regression model was fitted for each of the mastitis indicators for farms with tie-stall and free-stall housing systems separately to study the effect of other (than housing system) management practices on the ICM and PSCC events (above 200,000 cells/ml). The results differed substantially by housing system and outcome. In tie-stall systems, clinical mastitis incidence was mainly affected by region (mountainous production zone; incidence rate ratio (IRR) = 0.73), the dairy herd replacement system (IRR = 1.27) and the farmer's age (IRR = 0.81). The proportion of high SCC was mainly associated with dry cow udder controls (IRR = 0.67), clean bedding material at calving (IRR = 1.72), using total merit values to select bulls (IRR = 1.57) and body condition scoring (IRR = 0.74). In free-stall systems, the IRR for clinical mastitis was mainly associated with stall climate/temperature (IRR = 1.65), comfort mats as resting surface (IRR = 0.75) and the absence of a feed analysis (IRR = 1.18). The proportion of high SCC was only associated with hand and arm cleaning after calving (IRR = 0.81) and beef producing value to select bulls (IRR = 0.66). CONCLUSIONS There were substantial differences in the risk factors identified in the four models. Some of the factors were in agreement with the reported literature while others were not. This highlights the multifactorial nature of the disease and the differences in the risks for the two mastitis manifestations. Attempting to understand these multifactorial associations for mastitis within larger management groups continues to play an important role in mastitis control programs.
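To illustrate the model family referred to above (not the study's actual code or data), the following Python sketch fits a negative binomial regression for clinical mastitis counts with cow-years as exposure and reports incidence rate ratios as exponentiated coefficients; all variable names and values are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical farm-level data: mastitis case counts, cow-years at risk,
# and two illustrative management covariates.
df = pd.DataFrame({
    "cases":           [3, 7, 1, 12, 5, 9, 0, 4],
    "cow_years":       [25, 40, 18, 60, 30, 55, 20, 35],
    "mountain":        [1, 0, 1, 0, 0, 1, 1, 0],  # mountainous production zone
    "own_replacement": [1, 1, 0, 0, 1, 0, 1, 0],  # own herd replacement system
})

X = sm.add_constant(df[["mountain", "own_replacement"]])
model = sm.GLM(df["cases"], X,
               family=sm.families.NegativeBinomial(),
               exposure=df["cow_years"]).fit()

# Incidence rate ratios (IRR) are the exponentiated coefficients.
print(np.exp(model.params))
```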
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent in both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m-long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We show that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but also serve (ii) the calibration of the rockfall model Rockyfor3D, as well as (iii) the transformation of simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
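The frequency-transformation step can be illustrated with a small, purely hypothetical calculation: per-cell rockfall counts dated in tree rings give a real frequency, and the share of simulated trajectories crossing the cell converts simulated counts into events per year and return periods. The numbers and names below are invented and do not come from the study.

```python
# Hypothetical per-cell data for three 40 m x 40 m cells.
tree_impacts     = {"A": 25, "B": 8, "C": 40}    # impacts dated in tree rings
stand_age_years  = {"A": 90, "B": 60, "C": 120}  # observation window per cell
simulated_passes = {"A": 4000, "B": 1500, "C": 9000}  # trajectories crossing cell
total_simulated  = 10000                         # rockfalls simulated in the model

for cell in tree_impacts:
    # Real frequency from the dendrogeomorphic record (events per year).
    real_freq = tree_impacts[cell] / stand_age_years[cell]
    # Fraction of simulated rockfalls reaching this cell.
    reach_prob = simulated_passes[cell] / total_simulated
    # Calibration factor turning simulated counts into events per year,
    # which could then be applied to cells without trees.
    onset_rate = real_freq / reach_prob
    print(f"cell {cell}: {real_freq:.3f} events/yr, "
          f"return period {1 / real_freq:.1f} yr, onset rate {onset_rate:.2f}/yr")
```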
Abstract:
Whether anticoagulation management practices are associated with improved outcomes in elderly patients with acute venous thromboembolism (VTE) is uncertain. We therefore examined whether practices recommended by the American College of Chest Physicians guidelines are associated with outcomes in elderly patients with VTE. We studied 991 patients aged ≥65 years with acute VTE in a Swiss prospective multicenter cohort study and assessed adherence to four management practices: parenteral anticoagulation for ≥5 days, an INR ≥2.0 for ≥24 hours before stopping parenteral anticoagulation, an early start with vitamin K antagonists (VKA) within 24 hours of VTE diagnosis, and the use of low-molecular-weight heparin (LMWH) or fondaparinux. The outcomes were all-cause mortality, VTE recurrence, and major bleeding at 6 months, and the length of hospital stay (LOS). We used Cox regression and lognormal survival models, adjusting for patient characteristics. Overall, 9% of patients died, 3% had VTE recurrence, and 7% had major bleeding. An early start with VKA was associated with a lower risk of major bleeding (adjusted hazard ratio 0.37, 95% CI 0.20-0.71). An early start with VKA (adjusted time ratio [TR] 0.77, 95% CI 0.69-0.86) and the use of LMWH/fondaparinux (adjusted TR 0.87, 95% CI 0.78-0.97) were associated with a shorter LOS. An INR ≥2.0 for ≥24 hours before stopping parenteral anticoagulants was associated with a longer LOS (adjusted TR 1.2, 95% CI 1.08-1.33). In conclusion, adherence to recommended anticoagulation management practices showed mixed results in elderly patients with VTE: only an early start with VKA and the use of parenteral LMWH/fondaparinux were associated with better outcomes.
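For readers less familiar with the two model families mentioned (Cox regression for clinical events, lognormal survival models for length of stay), the sketch below shows how such models could be fitted with the lifelines library on simulated data; covariate names, effect sizes, and sample size are invented and do not reproduce the study's analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, LogNormalAFTFitter

# Simulated (not real) cohort with two binary management-practice covariates.
rng = np.random.default_rng(42)
n = 300
early_vka = rng.integers(0, 2, n)   # VKA started within 24 h of diagnosis
lmwh = rng.integers(0, 2, n)        # LMWH/fondaparinux used

# Time to major bleeding; a protective effect of early VKA is built into
# the simulation, with administrative censoring at 180 days.
bleed_time = rng.exponential(200 * np.exp(0.8 * early_vka))
observed = (bleed_time < 180).astype(int)
bleed_time = np.minimum(bleed_time, 180)

# Length of stay, shorter with early VKA and LMWH (lognormal AFT structure).
los = np.exp(2.2 - 0.25 * early_vka - 0.15 * lmwh + rng.normal(0, 0.3, n))

# Cox proportional hazards model: exp(coef) are hazard ratios.
cox_df = pd.DataFrame({"T": bleed_time, "E": observed,
                       "early_vka": early_vka, "lmwh": lmwh})
CoxPHFitter().fit(cox_df, duration_col="T", event_col="E").print_summary()

# Lognormal accelerated failure time model: exp(coef) are time ratios for LOS.
aft_df = pd.DataFrame({"T": los, "E": np.ones(n, dtype=int),
                       "early_vka": early_vka, "lmwh": lmwh})
LogNormalAFTFitter().fit(aft_df, duration_col="T", event_col="E").print_summary()
```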
Abstract:
In recent years, the automotive industry has shifted towards the "luxury universe", whose customers are demanding: each individual is treated as a unique and valued customer of the business, and vehicles are produced with state-of-the-art technologies and the highest finishing standards. Owing to the competitive level of the market, car makers put in place processes that treat customer service with the same urgency as emergency room (E.R.) management. This allows a comparison between car workshops and emergency rooms, in which workshop bays or ramps correspond to emergency boxes and skilled technicians correspond to health care specialists, who carry out tests and checks before any final operation and keep the "patient" under control until it returns to normal use. This paper establishes a model for the automotive industry to forecast customer service demand under variable demand conditions, using analogies with the patient demand models used for the medical E.R.
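The E.R. analogy lends itself to a standard queueing illustration. The sketch below uses the generic Erlang-C (M/M/c) waiting formula to gauge how many workshop bays ("emergency boxes") would keep the probability of waiting low under assumed Poisson arrivals; the arrival and service figures are invented and the formula is a textbook result, not the model proposed in the paper.

```python
import math

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arriving job has to wait in an M/M/c queue."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    if servers <= a:
        return 1.0                           # unstable system: everyone waits
    summation = sum(a**k / math.factorial(k) for k in range(servers))
    top = a**servers / math.factorial(servers) * servers / (servers - a)
    return top / (summation + top)

# Hypothetical workshop: 6 cars/hour arrive, each bay finishes 0.8 car/hour.
arrivals, service = 6.0, 0.8
for bays in range(8, 15):
    print(bays, "bays -> P(wait) =", round(erlang_c(arrivals, service, bays), 3))
```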
Abstract:
The interactions among three important issues involved in the implementation of logic programs in parallel (goal scheduling, precedence, and memory management) are discussed. A simplified, parallel memory management model and an efficient, load-balancing goal scheduling strategy are presented. It is shown how, for systems which support "don't know" non-determinism, special care has to be taken during goal scheduling if the space recovery characteristics of sequential systems are to be preserved. A solution based on selecting only "newer" goals for execution is described, and an algorithm is proposed for efficiently maintaining and determining precedence relationships and variable ages across parallel goals. It is argued that the proposed schemes and algorithms make it possible to extend the storage performance of sequential systems to parallel execution without the considerable overhead previously associated with it. The results are applicable to a wide class of parallel and coroutining systems, and they represent an efficient alternative to "all heap" or "spaghetti stack" allocation models.
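A highly simplified, hypothetical sketch of the "newer goals only" selection idea described above: goals are stamped with ages, and a worker only picks goals newer than the one it is currently executing, so that space reclaimed on failure of a "don't know" branch stays at the top of a stack, as in sequential execution. The identifiers below are illustrative and do not correspond to any actual implementation.

```python
from dataclasses import dataclass, field
from itertools import count

_age = count()  # global age counter: larger value = newer goal

@dataclass
class Goal:
    age: int = field(default_factory=lambda: next(_age))
    name: str = ""

def select_goal(pool, current):
    """Pick a goal for parallel execution only if it is newer than the goal
    the worker is currently executing; this keeps space recovery stack-like
    when a nondeterministic branch fails and its bindings are undone."""
    candidates = [g for g in pool if g.age > current.age]
    if not candidates:
        return None
    chosen = min(candidates, key=lambda g: g.age)  # oldest of the newer goals
    pool.remove(chosen)
    return chosen

current = Goal(name="p(X)")
pool = [Goal(name="q(X)"), Goal(name="r(X)"), Goal(name="s(X)")]
print(select_goal(pool, current))
```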
Abstract:
Replication Data Management (RDM) aims at enabling the use of data collections from several iterations of an experiment. However, RDM faces several major challenges in integrating data models and data from empirical study infrastructures that were not designed to cooperate, e.g., data model variation across local data sources. [Objective] In this paper we analyze RDM needs and evaluate conceptual RDM approaches to support replication researchers. [Method] We adapted the ATAM evaluation process to (a) analyze RDM use cases and needs of empirical replication study research groups and (b) compare three conceptual approaches to address these RDM needs: central data repositories with a fixed data model, heterogeneous local repositories, and an empirical ecosystem. [Results] While the central and local approaches have major issues that are hard to resolve in practice, the empirical ecosystem allows bridging current gaps in RDM from heterogeneous data sources. [Conclusions] The empirical ecosystem approach should be explored in diverse empirical environments.
Abstract:
In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the sampling period, the set of endogenous variables, and the deterministic terms). For VAR models we use the Granger causality test to assess the ability of one variable to predict another; in the case of cointegration we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small data sets and overfitting problems we use Bayesian VAR models with impulse response functions and variance decomposition to analyse the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating several hypotheses. Three VAR models were used: first, to study monetary policy decisions and to discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); and third, to evaluate the effects of ageing on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced by Chapter 1, which outlines the context, motivation, and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters. Chapter 2 examines, using a first-difference VAR model with quarterly Euro Area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results show a causal relationship running from the gap between the growth rates of nominal GDP and target GDP to changes in the three-month market interest rate. The same analysis does not appear to confirm the existence of a significant inverse causal relationship from changes in the market interest rate to the gap between the growth rates of nominal GDP and target GDP. Similar results were obtained by replacing the market interest rate with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule and raises more general doubts about the applicability of the Taylor rule and of all conventional monetary policy rules to the case in question. The results instead appear more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory and, more specifically, the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015).
These lines of research challenge the simplistic thesis that the scope of monetary policy is the stabilization of inflation, real GDP, or nominal income around a "natural equilibrium" level. Rather, they suggest that central banks actually pursue a more complex aim, namely the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the relative solvency of economic units. Chapter 3 analyses loan supply considering the money endogeneity arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run with a study of the United States during the two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel using the loan series adjusted for securitization, in order to test whether the US banking system is induced to seek cheaper sources of funding, such as securitization, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analysing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitization affects the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as a preventive hedge against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index, and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data extracted from the OECD and Eurostat databases. The impulse response functions and the variance decomposition show a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely related to ageing may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps improve patient well-being without increasing total health expenditure. However, policies that improve the health status of older people may be necessary to lower the per capita demand for health and social services.
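As an illustration of the toolchain described in the thesis (reduced-form VAR, Granger causality tests, impulse responses and variance decomposition), the sketch below applies statsmodels to simulated data; the series names, lag order, and data-generating process are placeholders rather than the thesis's actual specifications.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated quarterly data: a nominal-GDP-gap series and a market interest rate,
# where the gap feeds into the rate with a one-period lag.
rng = np.random.default_rng(1)
n = 120
gdp_gap = np.zeros(n)
rate = np.zeros(n)
for t in range(1, n):
    gdp_gap[t] = 0.6 * gdp_gap[t - 1] + rng.normal(0, 1)
    rate[t] = 0.5 * rate[t - 1] + 0.3 * gdp_gap[t - 1] + rng.normal(0, 0.5)

data = pd.DataFrame({"gdp_gap": gdp_gap, "rate": rate})

model = VAR(data)
res = model.fit(maxlags=4, ic="aic")  # lag order chosen by AIC

# Granger causality: does the GDP gap help predict the interest rate?
print(res.test_causality("rate", ["gdp_gap"], kind="f").summary())
# ...and the reverse direction.
print(res.test_causality("gdp_gap", ["rate"], kind="f").summary())

irf = res.irf(8)     # orthogonalized impulse responses; irf.plot() would chart them
res.fevd(8).summary()  # forecast error variance decomposition
```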
Abstract:
Long-term forecasts of pest pressure are central to the effective management of many agricultural insect pests. In the eastern cropping regions of Australia, serious infestations of Helicoverpa punctigera (Wallengren) and H. armigera (Hübner) (Lepidoptera: Noctuidae) are experienced annually. Regression analyses of a long series of light-trap catches of adult moths were used to describe the seasonal dynamics of both species. The size of the spring generation in eastern cropping zones could be related to rainfall in putative source areas in inland Australia. Subsequent generations could be related to the abundance of various crops in agricultural areas, rainfall and the magnitude of the spring population peak. As rainfall figured prominently as a predictor variable, and can itself be predicted using the Southern Oscillation Index (SOI), trap catches were also related to this variable. The geographic distribution of each species was modelled in relation to climate, and CLIMEX was used to predict temporal variation in abundance at given putative source sites in inland Australia using historical meteorological data. These predictions were then correlated with subsequent pest abundance data in a major cropping region. The regression-based and bioclimatic approaches to predicting pest abundance are compared and their utility in predicting and interpreting pest dynamics is discussed.
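A minimal sketch, on invented numbers, of the kind of regression relationship described: the (log) size of the spring moth generation regressed on inland rainfall and the Southern Oscillation Index by ordinary least squares. It is not the authors' dataset or model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical yearly records: winter rainfall (mm) in inland source areas,
# mean SOI, and the log spring light-trap catch in a cropping region.
rng = np.random.default_rng(7)
years = 30
rain = rng.gamma(shape=4, scale=30, size=years)
soi = rng.normal(0, 8, size=years)
log_catch = 0.5 + 0.01 * rain + 0.03 * soi + rng.normal(0, 0.4, size=years)

X = sm.add_constant(pd.DataFrame({"rain": rain, "soi": soi}))
fit = sm.OLS(log_catch, X).fit()
print(fit.summary())

# Forecast for a hypothetical new season.
new = sm.add_constant(pd.DataFrame({"rain": [150.0], "soi": [5.0]}),
                      has_constant="add")
print("predicted log catch:", fit.predict(new)[0])
```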
Abstract:
The starting point of the project was the observation that strategic management is absent in small businesses. The first objective of the project was to examine the reasons causing this situation in Greece, the second to examine the appropriateness of contemporary models of strategic planning for Greek S.M.E.s, and the third to examine the appropriateness of alternative approaches to strategic management for Greek S.M.E.s. The term appropriateness includes (a) the ability of managers to use the models and (b) the ability of the models to assist the managers. The results of the research indicate that neither of these two conditions holds; hence, it is suggested that the contemporary models of strategic management are inappropriate for Greek S.M.E.s. Many previous research projects on the topic suggest that since the strategic decision making process in S.M.E.s is informal, the whole process is absent or ineffective. Current trends in S.M.E.s' strategic management do not consider the informality of the strategic decision making process as a kind of managerial illness, but as a managerial characteristic. The use of sophisticated data collection and analytical methods does not indicate successful strategic decisions; rather, it indicates the method large firms use to manage their strategy. According to the literature review, S.M.E. managers avoid using the contemporary models of strategic management because they do not have the knowledge, the resources or the time. Another thesis, expressed by some firm specialists, suggests that small firms are different from large ones, and hence their practice of strategic management should not follow large-firm prototypes.
Abstract:
A prominent theme emerging in Occupational Health and Safety (OSH) is the development of management systems. A range of interventions, following a prescribed route detailed by one of the management systems, can be introduced into an organisation with some expectation of improved OSH performance. This thesis attempts to identify the key influencing factors that may impact upon the process of introducing interventions (according to BS 8800:1996, Guide to Implementing Occupational Health and Safety Management Systems) into an organisation. To help identify these influencing factors, a review of possible models from the sphere of Total Quality Management (TQM) was undertaken and the most suitable TQM model selected for development and use in OSH. By anchoring the OSH model's development in the reviewed literature, a range of core, medium and low level influencing factors was identified. This model was developed in conjunction with the research data generated within the case study organisation (a rubber manufacturer) and applied to the organisation. The key finding was that the implementation of an OSH intervention was dependent upon three broad vectors of influence. The first is the Incentive to introduce change within an organisation, which refers to the drivers or motivators for OSH. The second is the Ability within the management team to actually implement the changes, which refers to aspects such as leadership, commitment and perceptions of OSH, among others. Ability is in turn itself influenced by the environment within which change is being introduced. This third aspect, Receptivity, refers to the history of the plant and the characteristics of the workforce, including the workforce profile and organisational policies, among others. It was found that the TQM model selected and developed for an OSH management system intervention did explain the core influencing factors and their impact upon OSH performance. It was also found that, within the organisation, the results that might have been expected from the implementation of BS 8800:1996 were not realised. The OSH model highlighted that, given the organisation's starting point, a poor appreciation of the human factors of OSH gave little reward for the implementation of an OSH management system. In addition, it was found that the general organisational culture can effectively suffocate any attempts to generate a proactive safety culture.