898 results for Transition to first birth


Relevance: 100.00%

Abstract:

Objectives: To determine the relationship between infections and functional impairment in nursing home residents. Design: Prospective cohort study (follow-up period, 6 months). Setting: Thirty-nine nursing homes in western Switzerland (canton of Vaud). Participants: A total of 1,324 residents aged 65 and older (mean age 85.7; 76.6% female) who agreed to participate, or their proxies, by oral informed consent. Measurements: Demographic and medical data, including risk and protective factors for infection, were collected at baseline. During follow-up, nursing home nurses documented signs and symptoms of infection using the criteria developed specifically by the APIC for long-term care facilities. Functional status was measured every 3 months. Two different outcomes were used: (a) functional decline, defined as death or decreased function at follow-up, and (b) functional status score using a standardized measure.
Results: At the end of follow-up, mortality was 14.6% and did not differ between residents with and without infection (16.2% vs 13.1%, P = .11). During both 3-month periods, subjects with infection had higher odds of functional decline, even after adjustment for baseline demographic, medical, and functional characteristics and the occurrence of a new illness (adjusted odds ratio (AOR) = 1.6, 95% confidence interval (CI) = 1.2-2.2, P = .002, and AOR = 1.5, 95% CI = 1.1-2.0, P = .008, respectively). The odds of decline increased in a stepwise fashion in residents with zero, one, and two or more infections. Analyses predicting the functional status score (restricted to subjects who survived) gave similar results. A survival analysis predicting time to first infection confirmed a stepwise greater likelihood of infection in subjects with moderate and severe impairment at baseline than in subjects with no or mild functional impairment at baseline. Conclusion: Infections appear to be both a cause and a consequence of functional impairment in nursing home residents. Further studies should be undertaken to investigate whether effective infection control programs can also contribute to preventing functional decline, an important component of these residents' quality of life.
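As background on how such figures are read (a textbook relation, not taken from the paper): an adjusted odds ratio and its 95% confidence interval are recovered from a fitted logistic regression coefficient \(\hat\beta\) and its standard error as

\[ \widehat{\mathrm{OR}} = e^{\hat\beta}, \qquad \mathrm{CI}_{95\%} = \exp\!\big(\hat\beta \pm 1.96\,\widehat{\mathrm{SE}}(\hat\beta)\big), \]

so the reported AOR of 1.6 (CI 1.2-2.2) corresponds to a coefficient of roughly ln 1.6 ≈ 0.47 on the log-odds scale.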

Relevance: 100.00%

Abstract:

The 2012 Iowa Code section 324A.4, subsection 2, states the Iowa Department of Transportation (DOT) "shall biennially prepare a report to be submitted to the general assembly and the governor prior to December 15 of even-numbered years. The report shall recommend methods to increase transportation coordination and improve the efficiency of federal, state, and local government programs used to finance public transit services and may address other topics as appropriate." Iowa has long been a leader in transportation coordination: designated public transit agencies cover all 99 counties with little duplication; any agency receiving public dollars for the provision of transportation must first coordinate with the local public transit agency before providing the transportation on its own; and the Iowa Transportation Coordination Council was created to support these efforts. Coordination allows Iowa to provide much-needed transportation services to its citizens with the most efficient use of public funds. Coordination has been an important topic in Iowa for many years, but in times of economic constraint and with Iowa's changing demographics it becomes even more critical.

Relevance: 100.00%

Abstract:

Next-generation sequencing techniques such as exome sequencing can successfully detect all genetic variants in a human exome, and together with variant filters this has been useful for identifying disease-causing mutations. Two filters are mainly used for mutation identification: low allele frequency and the computational annotation of the genetic variant. Bioinformatic tools that predict the effect of a given variant may err because of existing biases in databases, and they sometimes show limited agreement among themselves. Advances in functional and comparative genomics are needed in order to annotate these variants properly. The goals of this study are, first, to functionally annotate Common Variable Immunodeficiency disease (CVID) variants with the available bioinformatic methods in order to assess the reliability of these strategies. Secondly, as the development of new methods to reduce the number of candidate genetic variants is an active and necessary field of research, we explore the utility of gene-function information at the organism level as a filter for identifying rare disease genes. It has recently been proposed that only 10-15% of human genes are essential, and we would therefore expect severe rare diseases to be caused mostly by mutations in them. Our goal is to determine whether or not these rare and severe diseases are caused by deleterious mutations in essential genes. If this hypothesis were true, using essential genes as a filter would be an interesting parameter for identifying disease-causing mutations.
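The filtering strategy described above composes naturally as a pipeline. The following minimal Python sketch illustrates it under stated assumptions: the record fields, the 1% frequency cutoff, the effect labels, and the gene sets are all hypothetical choices for illustration, not taken from the study.

# Hypothetical sketch of the described pipeline: keep rare variants
# (low allele frequency), keep predicted-deleterious annotations, and
# optionally restrict to "essential" genes. Field names are illustrative.

RARE_AF = 0.01  # a common low-allele-frequency cutoff; the study does not fix one

def candidate_variants(variants, essential_genes, require_essential=True):
    """Return variants passing the two classic filters plus the essentiality filter."""
    kept = []
    for v in variants:  # each v: dict with 'gene', 'allele_freq', 'predicted_effect'
        if v["allele_freq"] > RARE_AF:
            continue  # filter 1: low allele frequency
        if v["predicted_effect"] not in {"deleterious", "damaging"}:
            continue  # filter 2: computational annotation of the variant
        if require_essential and v["gene"] not in essential_genes:
            continue  # proposed filter 3: gene essentiality at organism level
        kept.append(v)
    return kept

# Illustrative records: only the first passes all three filters.
variants = [
    {"gene": "ICOS", "allele_freq": 0.0001, "predicted_effect": "deleterious"},
    {"gene": "TTN",  "allele_freq": 0.2,    "predicted_effect": "deleterious"},
]
print(candidate_variants(variants, essential_genes={"ICOS"}))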

Relevance: 100.00%

Abstract:

The good news with regard to this (or any) chapter on the future of leadership is that there is one. There was a time when researchers called for a moratorium on new leadership theory and research (e.g., Miner, 1975), citing the uncertain future of the field. Then for a time there was a popular academic perspective that leadership did not really matter when it came to shaping organizational outcomes (Meindl & Ehrlich, 1987; Meindl, Ehrlich, & Dukerich, 1985; Pfeffer, 1977). That perspective was laid to rest by "realists" in the field (Day & Antonakis, 2012a) by means of empirical re-interpretation of the results used to support the position that leadership does not matter (Lieberson & O'Connor, 1972; Salancik & Pfeffer, 1977). Specifically, Day and Lord (1988) showed that when proper methodological concerns were addressed (e.g., controlling for industry and company size effects; incorporating appropriate time lags), the impact of top-level leadership was considerable, explaining as much as 45% of the variance in measures of organizational performance. Despite some recent pessimistic sentiments about the "curiously unformed" state of leadership research and theory (Hackman & Wageman, 2007), others have argued that the field has continued to evolve and is potentially on the threshold of some significant breakthroughs (Day & Antonakis, 2012a). Leadership scholars have been re-energized by new directions in the field, and research efforts have revitalized areas previously abandoned for apparent lack of consistency in findings (e.g., leadership trait theory). Our accumulated knowledge now allows us to explain the nature of leadership, including its biological bases, other antecedents, and consequences, with some degree of confidence. Other comprehensive sources review the extensive theoretical and empirical foundation of leadership (Bass, 2008; Day & Antonakis, 2012b), so that will not be the focus of the present chapter. Instead, we take a future-oriented perspective, identifying particular areas within the leadership field that we believe offer promising perspectives on the future of leadership. Nonetheless, it is worthwhile as background to first provide an overview of how the leadership field has changed over the past decade or so. This short chronicle will set the stage for a keener understanding of where future contributions are likely to emerge. Overall, across nine major schools of leadership (trait, behavioural, contingency, contextual, relational, sceptics, information processing, New Leadership, and biological/evolutionary), researchers have seen a resurgence of interest in one area, a high level of activity in at least four others, inactivity in three, and modest activity in one that we think holds strong promise for the future (Gardner, Lowe, Moss, Mahoney, & Cogliser, 2010). We next provide brief overviews of these nine schools and their respective levels of research activity (see Figure 1).

Relevance: 100.00%

Abstract:

Purpose: Status epilepticus (SE) that is resistant to two antiepileptic compounds is defined as refractory status epilepticus (RSE). In the few available retrospective studies, the estimated frequency of RSE is between 31% and 43% of patients presenting with an SE episode, and almost all seem to require coma induction for treatment. We prospectively assessed RSE frequency, clinical predictors, and outcome in a tertiary clinical setting. Methods: Over 2 years we collected 128 consecutive SE episodes (118 patients) in adults. Clinical data and their relationship to outcome (mortality and return to baseline clinical conditions) were analyzed. Results: Twenty-nine of 128 SE episodes (22.6%) were refractory to first- and second-line antiepileptic treatments. Severity of consciousness impairment and de novo episodes were independent predictors of RSE. RSE had a worse outcome than non-RSE (39% vs. 11% mortality; 21% vs. 63% return to baseline clinical conditions). Only 12 patients with RSE (41%) required coma induction for treatment. Discussion: This prospective study identifies clinical factors predicting the onset of SE refractoriness. RSE appears to be less frequent than previously reported in retrospective studies; furthermore, most RSE episodes were treated outside the intensive care unit (ICU). Nonetheless, we confirm that RSE is characterized by high mortality and morbidity.

Relevance: 100.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry employs over 200'000 people today in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today more RISC computer systems run Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated - freeware, ad-based, or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication, and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks - a new standard leveraging frequency channels earlier occupied by TV broadcasting - have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research, and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors (customers, incumbents, and newcomers) on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focusing on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux, and SoC - leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others - created new markets for personal computers, smartphones, and tablets, and will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

Asphalt pavements suffer various failures due to insufficient quality within their design lives. The American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide (MEPDG) has been proposed to improve pavement quality through quantitative performance prediction. Evaluating the actual performance (quality) of pavements requires in situ nondestructive testing (NDT) techniques that can accurately measure the most critical, objective, and sensitive properties of pavement systems. The purpose of this study is to assess existing as well as promising new NDT technologies for quality control/quality assurance (QC/QA) of asphalt mixtures. Specifically, this study examined field measurements of density via the PaveTracker electromagnetic gage, shear-wave velocity via surface-wave testing methods, and dynamic stiffness via the Humboldt GeoGauge for five representative paving projects covering a range of mixes and traffic loads. The in situ tests were compared against laboratory measurements of core density and dynamic modulus. The in situ PaveTracker density had a low correlation with laboratory density and was not sensitive to variations in temperature or asphalt mix type. The in situ shear-wave velocity measured by surface-wave methods was most sensitive to variations in temperature and asphalt mix type. The in situ density and in situ shear-wave velocity were combined to calculate an in situ dynamic modulus, which is a performance-based quality measurement. The in situ GeoGauge stiffness measured on hot asphalt mixtures several hours after paving had a high correlation with the in situ dynamic modulus and the laboratory density, whereas the stiffness of asphalt mixtures cooled with dry ice or measured at ambient temperature one or more days after paving had a very low correlation with the other measurements. To transform the in situ moduli from surface-wave testing into quantitative quality measurements, a QC/QA procedure was developed that first corrects the in situ moduli measured at different field temperatures to moduli at a common reference temperature, based on master curves from laboratory dynamic modulus tests. The corrected in situ moduli can then be compared against the design moduli to assess the actual pavement performance. A preliminary study of microelectromechanical systems (MEMS)-based sensors for QC/QA and health monitoring of asphalt pavements was also performed.
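As an illustration of how density and shear-wave velocity combine into a modulus, here is a minimal Python sketch using the standard small-strain elasticity relations G = rho * Vs^2 and E = 2G(1 + nu); this is not the study's exact procedure, and the Poisson's ratio and example values are assumptions.

# Sketch (standard linear elasticity, illustrative only): combine in situ
# density and shear-wave velocity into a small-strain dynamic modulus.

def dynamic_modulus(density_kg_m3, vs_m_per_s, poisson=0.35):
    """E = 2 * G * (1 + nu), with shear modulus G = rho * Vs**2, in Pa."""
    g = density_kg_m3 * vs_m_per_s ** 2  # shear modulus from density and Vs
    return 2.0 * g * (1.0 + poisson)

# Example: rho = 2400 kg/m^3, Vs = 1500 m/s -> roughly 15 GPa
print(dynamic_modulus(2400, 1500) / 1e9, "GPa")

A full QC/QA implementation would additionally shift this result to the common reference temperature using the laboratory master curve, as described above.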

Relevance: 100.00%

Abstract:

Summary: The transition to, and postponement of, institutional long-term care of the elderly, from the relatives' point of view.

Relevance: 100.00%

Abstract:

Surface geological mapping, laboratory measurements of rock properties, and seismic reflection data are integrated through three-dimensional seismic modeling to determine the likely cause of upper crustal reflections and to elucidate the deep structure of the Penninic Alps in eastern Switzerland. Results indicate that the principal upper crustal reflections recorded on the south end of Swiss seismic line NFP20-EAST can be explained by the subsurface geometry of stacked basement nappes. In addition, modeling results provide improvements to structural maps based solely on surface trends and suggest the presence of previously unrecognized rock units in the subsurface. Construction of the initial model is based upon extrapolation of plunging surface structures; velocities and densities are established by laboratory measurements of the corresponding rock units. Iterative modification produces a best-fit model that refines the definition of the subsurface geometry of major structures. We conclude that most reflections from the upper 20 km can be ascribed to the presence of sedimentary cover rocks (especially carbonates) and ophiolites juxtaposed against crystalline basement nappes. Thus, in this area, reflections appear to be principally due to first-order lithologic contrasts. This study also demonstrates not only the importance of three-dimensional effects (sideswipe) in interpreting seismic data, but also that these effects can be treated quantitatively through three-dimensional modeling.
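As background on why first-order lithologic contrasts dominate reflectivity (a textbook relation, not specific to this study): at normal incidence, the reflection coefficient between two layers depends on their acoustic impedances \(Z_i = \rho_i V_i\),

\[ R = \frac{Z_2 - Z_1}{Z_2 + Z_1}, \]

so juxtaposing high-impedance carbonates or ophiolites against crystalline basement yields the strong upper-crustal reflections modeled here.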

Relevance: 100.00%

Abstract:

The current 1993 American Association of State Highway and Transportation Officials (AASHTO) Pavement Design Guide is based on the empirical interpretation of the results of the 1960 AASHTO Road Test. With the release of the new Mechanistic-Empirical (M-E) Pavement Design Guide, pavement design has taken a "quantum" leap forward. In order to effectively and efficiently transition to the M-E Pavement Design Guide, state DOTs need a detailed implementation and training strategy. This document is a plan for the M-E Pavement Design Guide to be implemented in Iowa.

Relevance: 100.00%

Abstract:

The aim of this article is to estimate the impact of various factors derived from role conflict theory and preference theory on the reduction of women's labour force participation after their transition to parenthood. Both objective and subjective dimensions of women's labour force participation are assessed. The empirical test is based on a survey of couples with children in Switzerland. Results show that, compared with structural factors associated with the reduction of role conflict, preferences have little impact on mothers' labour force participation, but they explain a good deal of mothers' frustration when the actual situation does not correspond to their wishes. Structural factors such as occupation, economic resources, childcare, and an urban environment support mothers' labour force participation, whereas active networks and a home-centred lifestyle preference help them cope with frustration.

Relevance: 100.00%

Abstract:

Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate the negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint, through several case studies. Four of them concern resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows in the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource); in addition, the case of lithium (a critical metal) was analysed briefly, qualitatively, and from an electric-mobility perspective. This thesis also includes a case study on the sustainability of space life support systems, that is, systems whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource-use perspective: the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they are closed and relatively simple systems compared with complex and open terrestrial systems such as the Canton of Geneva. The analysis method used in the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. For the space life support systems, material flow analysis was also employed, but as the available data on the dynamic behaviour of the systems were insufficient, only static simulations could be performed. The results of the Geneva case studies show the following: were resource use to follow population growth, resource consumption would be multiplied by nearly 1.2 by 2030 and by 1.5 by 2080. A complete transition to electric mobility would be expected to increase copper consumption per capita only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine; however, the health and environmental impacts of these options have yet to be studied. Increasing wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems, BIORAT outperforms ARES in resource use but not in energy use; however, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
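For readers unfamiliar with the method, the following minimal Python sketch shows the core bookkeeping of a dynamic material flow model; the constant inflow and fixed product lifetime are illustrative assumptions, not values from the thesis.

# Minimal dynamic material flow sketch (illustrative only): track an in-use
# stock driven by an assumed inflow series and a fixed product lifetime.
# Outflow in year t is the inflow from `lifetime` years earlier.

def simulate_stock(inflows, lifetime):
    """Return the in-use stock per year for a fixed-lifetime stock-flow model."""
    stock, stocks = 0.0, []
    for t, inflow in enumerate(inflows):
        outflow = inflows[t - lifetime] if t >= lifetime else 0.0
        stock += inflow - outflow
        stocks.append(stock)
    return stocks

# Example: constant inflow of 10 kt/yr with a 15-year lifetime ->
# the stock plateaus at 150 kt once outflows balance inflows.
print(simulate_stock([10.0] * 40, lifetime=15)[-1])

Real models typically replace the fixed lifetime with a lifetime distribution and calibrate inflows to historical data, which is where the reference-data uncertainty discussed below enters.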
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.

Relevance: 100.00%

Abstract:

The paper explains a teaching project financed by the University of Barcelona (UB). It focuses on a generic skill of the University of Barcelona, defined as "the learning capability and responsibility", which includes analytical and synthesis skills. It follows a multidisciplinary approach including teachers of Mathematics, World Economics, and Economic History, who all share the same students during the first and second years of the degree in Economics at the Faculty of Economics and Business. The project has been developed in three stages. The first was carried out during the first semester of the course 2012/13 and applied to first-year students in the subjects of Mathematics and Economic History. The second phase is being carried out during the second semester, in the Economic History subject only. A third stage will be carried out next course, 2013/14, with second-year students in the subject of World Economics. Each teaching team has developed specific materials and assessment tools for each of the subjects included in the project. The project emphasizes two teaching dimensions: the elaboration of teaching materials to promote the acquisition of generic skills from an interdisciplinary point of view, and the design of specific tools to assess such skills. The first results of the first phase of the project show clear deficiencies in the analytical skill of first-year students.
