472 results for Establishments
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this work, we address the Informational Density Index (IDI) as a methodological option for the research procedures of commerce geography. The IDI is an indicator of the technological complexity level of economic activities, and it was built from a database derived from CNEFE/CNAE, where the former is the National Register of Establishments for Statistical Purposes and the latter the National Classification of Economic Activities. With emphasis on the discussion of the relations between center and centrality, we present the construction of a sample-level database and an analysis of the IDI based on two criteria: the presence of a home page and of e-commerce. This methodology helps us understand polycentric structures and the location of enterprises, and it also supports a reflection on medium-sized cities through an analysis of the city of São Carlos/SP. There we identify the establishments with greater and lesser informational content, as well as their geographical distribution, noting that, according to the criteria listed, the number of establishments with IDI 0 is very significant. Our analysis also covers the divisions that reach the highest index, IDI 2, such as financial activities. At the methodological level, we also present the city of São José do Rio Preto/SP to illustrate the precautions to be taken when working with the database.
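The abstract does not spell out how the index is scored. One plausible reading, stated here purely as an assumption, is that each establishment earns one point per informational criterion met (home page, e-commerce), yielding IDI 0, 1 or 2, which matches the levels cited above. A minimal sketch of that scoring over invented records:

```python
# Minimal sketch of one plausible IDI scoring rule: one point per
# informational criterion present (home page, e-commerce), so IDI is
# 0, 1, or 2. The abstract does not give the exact rule, and the rows
# below are invented examples, not CNEFE/CNAE records.
from collections import Counter

establishments = [
    {"cnae": "4711-3", "name": "grocery",  "homepage": False, "ecommerce": False},
    {"cnae": "4751-2", "name": "IT store", "homepage": True,  "ecommerce": True},
    {"cnae": "6422-1", "name": "bank",     "homepage": True,  "ecommerce": True},
    {"cnae": "9602-5", "name": "barber",   "homepage": True,  "ecommerce": False},
]

def idi(est: dict) -> int:
    """Score one establishment: one point per criterion met."""
    return int(est["homepage"]) + int(est["ecommerce"])

distribution = Counter(idi(e) for e in establishments)
for level in (0, 1, 2):
    print(f"IDI {level}: {distribution.get(level, 0)} establishment(s)")
```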
Abstract:
This research analyzes the adoption process of the green certification "Leadership in Energy and Environmental Design" (LEED) among the hotel-sector establishments that have already adopted it. To that end, a bibliographical review was carried out, along with secondary data gathering from journals, institutional websites and documents, and primary data gathering through semi-structured interviews with the people responsible for the certified hotels and for the entity responsible for the certification in Brazil (Green Building Council Brazil). There were 21 interviewees: 2 from GBC Brazil and 19 from lodging establishments (31% of those certified). Data were analyzed with the content analysis technique, aided by the ATLAS.ti software. The results made it possible to identify the chronology of the certification processes and the profile of the hotel categories that adopt the LEED program. Beyond that, the interviews enabled a discussion of the initial motivations for seeking the certification, as well as the advantages and obstacles perceived in its adoption.
Abstract:
The growth parameters (growth rate, μ, and lag time, λ) of three different strains each of Salmonella enterica and Listeria monocytogenes in minimally processed lettuce (MPL), and their changes as a function of temperature, were modeled. MPL was packed under modified atmosphere (5% O₂, 15% CO₂ and 80% N₂) and stored at 7–30 °C, and samples collected at different time intervals were enumerated for S. enterica and L. monocytogenes. Growth curves and equations describing the relationship between μ and λ as a function of temperature were constructed using the DMFit Excel add-in and through linear regression, respectively. The growth parameters predicted for the pathogens in this study were compared with ComBase, the Pathogen Modeling Program (PMP) and data from the literature. High R² values (0.97 and 0.93) were observed for the average growth curves of the different strains of the pathogens grown on MPL. Secondary models of μ and λ for both pathogens followed a linear trend with high R² values (>0.90). Root mean square error (RMSE) showed that the models obtained are accurate and suitable for modeling the growth of S. enterica and L. monocytogenes in MP lettuce. The current study provides growth models for these foodborne pathogens that can be used in microbial risk assessment.
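The abstract reports that the secondary models of μ and λ follow a linear trend with temperature but does not give their exact form; a common choice in predictive microbiology is a Ratkowsky-type square-root model. A minimal sketch under that assumption, with invented data standing in for the MPL measurements:

```python
# Minimal sketch of a secondary growth model, assuming a Ratkowsky-type
# square-root relationship sqrt(mu) = b * (T - T0). The abstract says
# only that mu followed a linear trend with temperature; both the model
# form and the data below are illustrative assumptions, not study data.
import numpy as np

# Hypothetical growth rates (1/h) at the storage temperatures (deg C)
T = np.array([7.0, 10.0, 15.0, 20.0, 25.0, 30.0])
mu = np.array([0.02, 0.05, 0.14, 0.28, 0.46, 0.70])

# Linear regression of sqrt(mu) on T: slope b, intercept a = -b * T0
b, a = np.polyfit(T, np.sqrt(mu), 1)
T0 = -a / b  # notional minimum growth temperature

mu_pred = (b * (T - T0)) ** 2
rmse = np.sqrt(np.mean((mu - mu_pred) ** 2))  # accuracy measure, as in the study
r2 = 1 - np.sum((mu - mu_pred) ** 2) / np.sum((mu - mu.mean()) ** 2)

print(f"b = {b:.4f}, T0 = {T0:.1f} C, RMSE = {rmse:.4f}, R^2 = {r2:.3f}")
```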
Abstract:
Objective: To investigate the use of nasal intermittent positive pressure ventilation (NIPPV) in level three neonatal intensive care units (NICUs) in northeastern Brazil. Methods: This observational cross-sectional survey was conducted from March 2009 to January 2010 in all level three NICUs in northeastern Brazil that are registered in the Brazilian Registry of Health Establishments (Cadastro Nacional de Estabelecimentos de Saúde, CNES) of the Ministry of Health. Questionnaires about the use of NIPPV were sent to the NICU directors of each institution. Statistical analysis was conducted using the software Epi-Info 6.04 with double data entry. A chi-square test was used to compare variables, and the level of statistical significance was set at p ≤ 0.05. Results: This study identified 93 level three NICUs in northeastern Brazil registered in CNES, and 87% answered the study questionnaire. Most classified themselves as private institutions (30.7%); 98.7% used NIPPV; 92.8% adapted mechanical ventilators for NIPPV, and short binasal prongs were the usual interface (94.2%). Only 17.3% of the units had a protocol for the use of NIPPV. Mean positive inspiratory pressure and positive end-expiratory pressure were 20.0 cmH₂O (standard deviation [SD]: 4.47) and 5.0 cmH₂O (SD: 0.84). Conclusion: NICUs in northeastern Brazil use nasal intermittent positive pressure ventilation, but indications and ventilation settings differ across institutions.
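The abstract notes only that variables were compared with a chi-square test at p ≤ 0.05 (in Epi-Info 6.04). For illustration, a minimal sketch of that kind of comparison using scipy, on an invented contingency table rather than the survey's data:

```python
# Minimal sketch of a chi-square comparison like the one described.
# The counts are hypothetical, not data from the survey.
from scipy.stats import chi2_contingency

# Rows: public vs. private NICUs; columns: has an NIPPV protocol or not
table = [[5, 30],
         [9, 37]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
print("significant at p <= 0.05" if p <= 0.05 else "not significant")
```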
Abstract:
Background: Although the release of cardiac biomarkers after percutaneous coronary intervention (PCI) or surgical revascularization (CABG) is common, its prognostic significance is not known. Questions remain about the mechanisms and the degree of correlation between the release, the volume of myocardial tissue loss, and the long-term significance. Delayed-enhancement cardiac magnetic resonance (CMR) consistently quantifies areas of irreversible myocardial injury. To investigate the quantitative relationship between irreversible injury and cardiac biomarkers, we will evaluate the extent of irreversible injury in patients undergoing PCI and CABG and relate it to postprocedural changes in cardiac biomarkers and to long-term prognosis. Methods/Design: The study will include 150 patients with multivessel coronary artery disease (CAD), preserved left ventricular ejection fraction (LVEF), and a formal indication for CABG: 50 patients will undergo CABG with cardiopulmonary bypass (CPB); 50 patients with the same arterial and ventricular condition and an indication for myocardial revascularization will undergo CABG without CPB; and another 50 patients with CAD and preserved ventricular function will undergo PCI using stents. All patients will undergo CMR before and after surgery or PCI. We will also evaluate the release of cardiac markers of necrosis immediately before and after each procedure. The primary outcome is overall death during a 5-year follow-up. Secondary outcomes are the levels of the CK-MB isoenzyme and troponin I in association with the presence of myocardial fibrosis and systolic left ventricular dysfunction assessed by CMR. Discussion: The MASS-V Trial aims to establish reliable values for enzyme markers of myocardial necrosis in the absence of manifest myocardial infarction after mechanical interventions. The establishment of these indices has diagnostic and prognostic value and may therefore call for relevant and distinct therapeutic measures. In daily practice, the inappropriate use of these necrosis markers has led to misdiagnosis and, therefore, to wrong treatment. A more sensitive tool such as CMR provides unprecedented diagnostic accuracy for myocardial damage when correlated with necrosis enzyme markers. We aim to correlate laboratory data with imaging, thereby establishing more refined evidence on the presence or absence of irreversible myocardial injury after the procedure, whether percutaneous or surgical, and with or without the use of cardiopulmonary bypass.
Abstract:
Introduction: The Brazilian northeast region is historically affected by socioeconomic problems that make it particularly in need of strategies for the care of psychiatric disorders. Methods: This study includes an original analysis based on data on secondary-level health assistance extracted from Brazil's Hospital Information System, the Basic Assistance Information System, and the Brazilian Institute of Geography and Statistics. Results: Between 2008 and 2010, more than two hundred million dollars were spent by the Brazilian federal government to improve the quality of mental health assistance in the Northeast. The service network responsible for the treatment of mental disorders in primary care involves a wide range of professionals and establishments. Conclusion: In northeastern Brazil, socioeconomic and geographic conditions contribute to a particular state of vulnerability to the development of psychopathologies. The combination of primary care and an integrated public health network, however, has improved the attention given to mental disorders in this region.
Abstract:
This thesis consists of four self-contained essays in economics.

Tournaments and unfair treatment. This paper introduces the negative feelings associated with the perception of being unfairly treated into a tournament model and examines the impact of these perceptions on workers' efforts and their willingness to work overtime. The effect of unfair treatment on workers' behavior is ambiguous in the model, in that two countervailing effects arise: a negative impulsive effect and a positive strategic effect. The impulsive effect implies that workers react to the perception of being unfairly treated by reducing their level of effort. The strategic effect implies that workers raise this level in order to improve their career opportunities and thereby avoid feeling even more unfairly treated in the future. An empirical test of the model using survey data from a Swedish municipal utility shows that the overall effect is negative. This suggests that employers should consider the negative impulsive effect of unfair treatment on effort and overtime when designing contracts and deciding on promotions.

Late careers in Sweden between 1970 and 2000. This essay studies Swedish workers' late careers between 1970 and 2000. The aim is to examine older workers' career patterns and whether they changed during this period. For example, is there a difference in career mobility or labor market exit between cohorts? What affects the late career, and does this differ between cohorts? The analysis shows that between 1970 and 2000 the late careers of Swedish workers comprised few job changes and consisted more of "trying to keep the job you had in your mid-fifties" than of climbing the promotion ladder. There are no cohort differences in this pattern. Moreover, a large fraction of older workers exited the labor market before the normal retirement age of 65. During the 1970s and the first part of the 1980s, 56 percent of older workers made an early exit, and the average drop-out age was 63. During the late 1980s and the 1990s, the share of older workers who made an early exit rose to 76 percent, and the average drop-out age dropped to 61.5. Different factors affected the probability of an early exit between 1970 and 2000. For example, skills affected the risk of exiting the labor market during the 1970s and up to the mid-1980s, but not in the late 1980s or the 1990s. During the first period, older workers in the lowest occupations or with the lowest level of education were more likely to exit the labor market than more highly skilled workers. In the second period, older workers at all skill levels had the same probability of leaving the labor market.

The growth and survival of establishments: does gender segregation matter? We empirically examine the employment dynamics that arise in Becker's (1957) model of labor market discrimination. According to the model, firms that employ a large fraction of women will be relatively more profitable due to lower wage costs, and will thus enjoy a greater probability of surviving and growing by underselling other firms in the competitive product market. To test these implications, we use a unique Swedish matched employer-employee data set (a stylized sketch of this kind of test follows the abstract). We find that female-dominated establishments do not enjoy any greater probability of surviving and do not grow faster than other establishments. Additionally, we find that establishments that are integrated in terms of gender, age and education levels are more successful than other establishments. Thus, attempts by legislators to integrate firms along all dimensions of diversity may have positive effects on the growth and survival of firms.

Risk and overconfidence – gender differences in financial decision-making as revealed in the TV game-show Jeopardy. We use unique data from the Swedish version of the TV show Jeopardy to uncover gender differences in financial decision-making by looking at the contestants' final wagering strategies. After ruling out empirical best responses, which do appear in Jeopardy in the US, a simple model is derived to show that risk preferences and the subjective and objective probabilities of answering correctly (individual and group competence) determine wagering strategies. The empirical model shows that, on average, women adopt more conservative and diversified strategies, while men's strategies aim for the greatest gains. Further, women's strategies are more responsive to the competence measures, which suggests that they are less overconfident. Together these traits make women more successful players. These results are in line with earlier findings on gender and financial trading.
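The Becker-style test in the third essay amounts to asking whether a larger female share predicts establishment survival. A minimal sketch of such a regression on synthetic data (the thesis uses a Swedish matched employer-employee data set; the variable names and numbers here are hypothetical):

```python
# Minimal sketch of the survival test implied by Becker's (1957) model:
# a logit of establishment survival on the share of female employees.
# All data below are synthetic; under Becker's hypothesis the coefficient
# on female_share would be positive, and the thesis finds no such effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
female_share = rng.uniform(0, 1, n)   # share of women in the workforce
log_size = rng.normal(3, 1, n)        # log employment, a control

# Survival generated independently of female_share (true coefficient 0)
latent = 0.5 + 0.0 * female_share + 0.2 * log_size + rng.logistic(size=n)
survived = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([female_share, log_size]))
result = sm.Logit(survived, X).fit(disp=False)
print(result.summary(xname=["const", "female_share", "log_size"]))
```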
Abstract:
Lecture delivered during the institutional ceremony commemorating the XXV Anniversary of the Facultad de Veterinaria, organized by Jorge Orós Montón, Dean of the Faculty, in the Aula Magna of the Facultad de Veterinaria. Dr. Laszlo Fodor is President of the European Association of Establishments for Veterinary Education (EAEVE).
Abstract:
This portfolio is an Objective Structured Competence Examination (OSCE) designed for the assessment of the essential competences defined by the European Association of Establishments for Veterinary Education (EAEVE) in its programme Evaluation of Veterinary Training in Europe.
Abstract:
This thesis deals with context-aware services, smart environments, context management, and solutions for device and service interoperability. Multi-vendor devices offer an increasing number of services and end-user applications that base their value on the ability to exploit information originating from the surrounding environment by means of a growing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras and so on. However, such devices are usually not able to exchange information because they lack a shared data storage and common information exchange methods. A large number of standards and domain-specific building blocks are available and are heavily used in today's products. However, the use of these solutions based on ready-to-use modules is not without problems. The integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In such scenarios it is interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability. The result should also lead to faster and simplified design, development, and deployment of cross-domain applications. This thesis mainly focuses on software architectures supporting context-aware service providers, especially on the following subjects:

- user preference-based service adaptation
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability

Experimental activities were carried out in several domains, including cultural heritage and indoor and personal smart spaces, all of which are considered significant test-beds in context-aware computing. The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on "Processing Open Cultural Heritage", and within SOFIA, a project of the ARTEMIS JU on embedded systems. I worked in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland) and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis focuses on the problem statement and related work, and addresses interoperability issues and related architecture components. The second part focuses on specific architectures and frameworks:

- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context-, preference- and profile-based application broker which I designed within the EPOCH Network of Excellence
- M3: a "Semantic Web based" information-sharing infrastructure for smart spaces designed by Nokia within the European project SOFIA
- NoTA: a service- and transport-independent connectivity framework
- OSGi: the well-known Java-based service support framework

The final section is dedicated to the middleware, the tools, and the software agents developed during my doctorate to support context-aware services in smart environments.
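As an illustration of the shared context-store idea these frameworks revolve around, here is a minimal publish/subscribe sketch; it is a generic illustration of the pattern, not the actual API of MobiComp, M3 or any other framework named above:

```python
# Minimal sketch of the publish/subscribe context-store pattern that
# underlies context management frameworks; a generic illustration only,
# not the API of MobiComp, M3, NoTA or OSGi.
from collections import defaultdict
from typing import Any, Callable

class ContextStore:
    """Shared store where producers publish context items and
    consumers subscribe to changes by key (e.g. 'user.location')."""

    def __init__(self) -> None:
        self._items: dict[str, Any] = {}
        self._subscribers: dict[str, list[Callable[[str, Any], None]]] = defaultdict(list)

    def publish(self, key: str, value: Any) -> None:
        self._items[key] = value
        for callback in self._subscribers[key]:
            callback(key, value)

    def subscribe(self, key: str, callback: Callable[[str, Any], None]) -> None:
        self._subscribers[key].append(callback)

store = ContextStore()
store.subscribe("user.location", lambda k, v: print(f"{k} -> {v}"))
store.publish("user.location", (44.4949, 11.3426))  # e.g. a GPS fix in Bologna
```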
Abstract:
Urban systems consist of several interlinked sub-systems (social, economic, institutional and environmental), each representing a complex system of its own and affecting all the others at various structural and functional levels. An urban system comprises a number of "human" agents, such as individuals and households, and "non-human" agents, such as buildings, establishments, transport systems, vehicles and infrastructures. These two categories of agents interact with each other and simultaneously produce impacts on the system they interact with. Understanding the types of interaction and their spatial and temporal localisation well enough to allow very detailed simulation through models is a considerable effort, and it is the topic this research deals with. An analysis of urban system complexity is presented here, together with a state-of-the-art review of the field of urban models. Finally, six international models (MATSim, MobiSim, ANTONIN, TRANSIMS, UrbanSim, ILUTE) are illustrated and then compared.
Abstract:
The Standard Model of particle physics, which describes three of the four fundamental interactions, has so far agreed very well with the measurements from the experiments at CERN, Fermilab and other research institutions. However, not all questions of particle physics can be answered within this model. For example, the fourth fundamental force, gravity, cannot be incorporated into the Standard Model. Moreover, the Standard Model offers no candidate for dark matter, which according to cosmological measurements makes up about 25% of our universe. One of the most promising solutions to these open questions is supersymmetry, which introduces a symmetry between fermions and bosons. This model gives rise to so-called supersymmetric particles, each of which has a Standard Model particle as its partner. If supersymmetry is realized in nature, one possible model of this symmetry is the R-parity-conserving mSUGRA model. In this model the lightest supersymmetric particle (LSP) is neutral and weakly interacting, so it cannot be detected directly in the detector; it must instead be detected indirectly via the energy carried away by the LSP, the missing transverse energy (etmiss).

In 2010 the ATLAS experiment will start the search for new physics at the pp collider LHC, at a center-of-mass energy of sqrt(s) = 7-10 TeV and a luminosity of 10^32 cm^-2 s^-1. Because of the very high data rate, resulting from the roughly 10^8 readout channels of the ATLAS detector at a bunch-crossing rate of 40 MHz, a trigger system is needed to reduce the amount of data to be stored. A compromise must be found between the available trigger rate and a very high trigger efficiency for the interesting events, since only about one in 10^8 events is interesting for the search for new physics. To meet these requirements, the experiment uses a three-level trigger system, in which by far the largest data reduction takes place at the first trigger level.

This work, on the one hand, makes a substantial contribution to the fundamental understanding of the properties of the missing transverse energy at the first trigger level. On the other hand, it presents methods with which the etmiss trigger efficiency for Standard Model processes and possible mSUGRA scenarios can be determined from data. In the optimization of the etmiss trigger thresholds for the first trigger level, the trigger rate at a luminosity of 10^33 cm^-2 s^-1 was fixed at 100 Hz. The trigger optimization required several simulations, into which my own development work went. Using these simulations and the optimization algorithms developed here, it is shown that, despite the low trigger rate, combining the etmiss threshold with lepton or jet trigger thresholds increases the discovery potential (for a signal significance of at least 5 sigma) by up to 66% compared with the existing ATLAS trigger menu at the first trigger level.
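The threshold optimization described above reduces, in essence, to a constrained scan: among etmiss thresholds whose level-1 rate fits the 100 Hz budget, choose the one that maximizes the signal significance. A minimal sketch of that logic with toy rate and efficiency curves (all numbers are illustrative stand-ins, not ATLAS simulation results), using the common approximation significance = S/sqrt(B):

```python
# Minimal sketch of the threshold-optimization logic described above:
# among etmiss thresholds whose level-1 rate fits a 100 Hz budget, pick
# the one that maximizes the significance S / sqrt(B). The rate and
# efficiency curves below are toy stand-ins, not ATLAS data.
import numpy as np

thresholds = np.arange(20, 101, 5)                   # etmiss cuts (GeV)
rate = 1e4 * np.exp(-thresholds / 12.0)              # toy level-1 rate (Hz)
eff_sig = np.exp(-((thresholds - 20) / 90.0) ** 2)   # toy signal efficiency
eff_bkg = np.exp(-thresholds / 15.0)                 # toy background efficiency

n_signal, n_background = 50.0, 5e4                   # expected events before the cut

best = None
for thr, r, es, eb in zip(thresholds, rate, eff_sig, eff_bkg):
    if r > 100.0:                                    # enforce the 100 Hz budget
        continue
    s, b = n_signal * es, n_background * eb
    significance = s / np.sqrt(b) if b > 0 else 0.0
    if best is None or significance > best[1]:
        best = (thr, significance)

thr, z = best
print(f"best etmiss threshold: {thr} GeV, significance ~ {z:.1f} sigma")
```

In a full optimization one would also scan combined lepton+etmiss and jet+etmiss thresholds, which is the kind of combination that yields the gain of up to 66% quoted in the abstract.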