756 results for height-structured habitat metrics
Abstract:
OBJECTIVE: To analyze the lesions diagnosed in victims of falls, comparing them with those diagnosed in other mechanisms of blunt trauma. METHODS: We conducted a retrospective study of trauma protocol charts (prospectively collected) from 2008 to 2010, including victims of trauma over 13 years of age admitted to the emergency room. The severity of injuries was stratified by the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). Variables were compared between the group of victims of falls from height (Group 1) and the other victims of blunt trauma (Group 2). We used the Student t, chi-square and Fisher tests for comparison between groups, considering p < 0.05 as significant. RESULTS: The series comprised 4,532 cases of blunt trauma, of which 555 (12.2%) were victims of falls from height. Severe lesions (AIS ≥ 3) were observed in the extremities (17.5%), cephalic segment (8.4%), chest (5.5%) and abdomen (2.9%). Victims in Group 1 had significantly higher mean age, AIS in the extremities/pelvis, AIS in the thoracic segment and ISS (p < 0.05). Group 1 also had a significantly (p < 0.05) higher incidence of tracheal intubation on admission, pneumothorax, hemothorax, rib fractures, chest drainage, spinal trauma, pelvic fractures, complex pelvic fractures and fractures of the upper limbs. CONCLUSION: Victims of falls from height had greater anatomic injury severity and a greater frequency and severity of lesions in the thoracic segment and extremities.
Abstract:
OBJECTIVE: To identify predictors of death in blunt trauma patients sustaining pelvic fractures and, subsequently, to compare them with a previously reported series from the same center. METHODS: Retrospective analysis of trauma registry data, including blunt trauma patients older than 14 years of age sustaining pelvic fractures, admitted from 2008 to 2010. Patients were assigned to group 1 (deaths) or group 2 (survivors). We used Student's t, chi-square and Fisher's tests for statistical analysis, considering p < 0.05 as significant. Predictors of death were then compared between the two periods. RESULTS: Seventy-nine cases were included. Mean RTS, ISS and TRISS were, respectively, 6.44 ± 2.22, 28.0 ± 15.2 and 0.74 ± 0.33. Nineteen patients died (24.0%). The main cause of death was hemorrhage (42.1%). Group 1 was characterized by (p < 0.05) lower mean systolic blood pressure and Glasgow Coma Scale on admission; higher mean heart rate, head AIS, extremity AIS and ISS; and a higher frequency of severe head injuries and complex pelvic fractures. Comparing the two periods, the anatomic and physiologic severity of injury increased (RTS and ISS means). Furthermore, there was a decrease in the impact of associated thoracic and abdominal injuries on prognosis and an association of lethality with the presence of complex pelvic fractures. CONCLUSION: There were significant changes in the predictors of death between the two periods. The impact of associated thoracic and abdominal injuries decreased, while the importance of severe retroperitoneal hemorrhage increased. There was also an increase in trauma severity, which accounted for the high lethality.
Abstract:
Metadata, in increasing levels of sophistication, has been the most powerful concept used in the management of unstructured information ever since the first librarian used the Dewey Decimal System for library classification. It remains to be seen, however, what the best approach is to implementing metadata to manage huge volumes of unstructured information in a large organization. Also, once implemented, how is it possible to track whether it is adding value to the company, and whether the implementation has been successful? Existing literature on metadata seems either to focus too heavily on technical and quality aspects or to describe adoption issues for general information management initiatives. This research therefore strives to fill these gaps by providing a consolidated framework for understanding the value added by implementing metadata. The basic methodology is a case study, which incorporates aspects of design science, surveys, and interviews in order to provide a holistic approach to quantitative and qualitative analysis of the case. The research identifies the various approaches to implementing metadata, particularly studying the one followed by the unit of analysis of the case study, a large company in the oil and gas sector. Of the three approaches identified, the selected company already follows the one that appears to be superior. The researcher further explores its shortcomings and proposes a slightly modified approach that can address them. The research categorically and thoroughly (in context) identifies the top effectiveness criteria, and the corresponding key performance indicators (KPIs) that can be measured to understand the level of advancement of the metadata management initiative in the company. To contrast and provide a basis of comparison for the findings, the research also includes views from information managers dealing with core structured data stored in ERPs and other databases. In addition, the results include the basic criteria that can be used to evaluate metrics in order to classify a metric as a KPI.
Abstract:
This Ph.D. thesis consists of four original papers. The papers cover several topics from geometric function theory, more specifically, hyperbolic type metrics, conformal invariants, and the distortion properties of quasiconformal mappings. The first paper deals mostly with the quasihyperbolic metric. The main result gives the optimal bilipschitz constant with respect to the quasihyperbolic metric for the Möbius self-mappings of the unit ball. A quasi-invariance property, sharp in a local sense, of the quasihyperbolic metric under quasiconformal mappings is also proved. The second paper studies some distortion estimates for the class of quasiconformal self-mappings fixing the boundary values of the unit ball or convex domains. The distortion is measured by the hyperbolic metric or hyperbolic type metrics. The results provide explicit, asymptotically sharp inequalities when the maximal dilatation of quasiconformal mappings tends to 1. These explicit estimates involve special functions which have a crucial role in this study. In the third paper, we investigate the notion of the quasihyperbolic volume and find growth estimates for the quasihyperbolic volume of balls in a domain in terms of the radius. It turns out that in the case of domains with Ahlfors regular boundaries, the rate of growth depends not merely on the radius but also on the metric structure of the boundary. The topic of the fourth paper is complete elliptic integrals and inequalities. We derive some functional inequalities and elementary estimates for these special functions. As applications, some functional inequalities and the growth of the exterior modulus of a rectangle are studied.
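For readers unfamiliar with the terminology, the quasihyperbolic metric referred to above is commonly defined as follows; this is the standard definition from the literature, not a result of the thesis itself:

```latex
% Quasihyperbolic distance in a proper subdomain G of R^n (standard definition):
% the infimum, over rectifiable curves joining x and y in G, of the
% boundary-distance-weighted length of the curve.
\[
  k_G(x, y) \;=\; \inf_{\gamma \in \Gamma_{xy}} \int_{\gamma} \frac{|dz|}{d(z, \partial G)},
\]
% where \Gamma_{xy} is the family of rectifiable curves in G joining x and y,
% and d(z, \partial G) is the Euclidean distance from z to the boundary of G.
```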
Abstract:
Outbreaks of the stable fly, Stomoxys calcitrans, cause losses for livestock producers located near sugarcane mills in Brazil, especially in southern Mato Grosso do Sul. The sugarcane mills are often pointed to by local farmers as the primary source of these outbreaks; some mills have also joined the farmers in combating the flies. Brazilian beef cattle production has great economic importance, on a level similar to that of biofuel (ethanol) production. In this context, broad knowledge of the biology and ecology of the stable fly, including its larval habitats and reproduction sites, is extremely important for the further development of control programs. This paper reports the occurrence and development of S. calcitrans larvae inside sugarcane stems in three municipalities of Mato Grosso do Sul. The sugarcane stems give the larvae protection against adverse weather conditions and insecticide application. Therefore, for sustainable sugarcane production, specific research addressing this situation should be conducted.
Abstract:
Products developed by industries, institutes and research centers are expected to have a high level of quality and performance with minimal waste, which requires efficient and robust tools to numerically simulate stringent project conditions with great reliability. In this context, Computational Fluid Dynamics (CFD) plays an important role, and the present work shows two numerical algorithms that are used in the CFD community to solve the Euler and Navier-Stokes equations applied to typical aerospace and aeronautical problems. In particular, unstructured discretization of the spatial domain has gained special attention from the international community due to the ease with which it handles complex spatial domains. The main objective of this work is to illustrate some advantages and disadvantages of numerical algorithms using structured and unstructured spatial discretizations of the flow governing equations. The numerical methods use a finite volume formulation, and the Euler and Navier-Stokes equations are applied to solve a transonic nozzle problem, a low supersonic airfoil problem and a hypersonic inlet problem. In the structured context, these problems are solved using MacCormack's implicit algorithm with Steger and Warming's flux vector splitting technique, while in the unstructured context the explicit algorithm of Jameson and Mavriplis is used. Convergence acceleration is obtained using a spatially variable time stepping procedure.
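As an illustration of the convergence-acceleration idea mentioned at the end of this abstract, the sketch below shows a generic spatially variable (local) time-stepping rule for a finite-volume solver; the array names and CFL value are assumptions made for illustration and do not come from the thesis code.

```python
# Hedged sketch: spatially variable (local) time stepping for steady-state
# convergence acceleration, as commonly used with finite-volume Euler solvers.
import numpy as np

def local_time_steps(dx, u, c, cfl=0.8):
    """Return one pseudo-time step per cell: dt_i = CFL * dx_i / (|u_i| + c_i).

    dx : characteristic cell length of each finite volume
    u  : local flow speed in each cell
    c  : local speed of sound in each cell
    """
    return cfl * dx / (np.abs(u) + c)

# Example: coarse cells may advance with larger pseudo-time steps than fine
# ones, which speeds up convergence to the steady state without changing it.
dx = np.array([0.10, 0.05, 0.01])      # cell sizes (illustrative)
u  = np.array([300.0, 350.0, 400.0])   # m/s
c  = np.array([340.0, 330.0, 320.0])   # m/s
print(local_time_steps(dx, u, c))
```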
Abstract:
In 1859, Charles Darwin published his theory of evolution by natural selection, a process driven by fitness benefits and fitness costs at the individual level. Traditionally, evolution has been investigated by biologists, but it has inspired mathematical approaches as well. For example, adaptive dynamics has proven to be a very applicable framework for this purpose. Its core concept is the invasion fitness, whose sign tells whether a mutant phenotype can invade the prevalent phenotype. In this thesis, four real-world applications to evolutionary questions are provided. Inspiration for the first two studies arose from a cold-adapted species, the American pika. First, it is studied how global climate change may affect the evolution of dispersal and the viability of pika metapopulations. Based on the results, it is shown that the evolution of dispersal can result in extinction; indeed, the evolution of dispersal should be incorporated into viability analyses of species living in fragmented habitats. The second study focuses on the evolution of density-dependent dispersal in metapopulations with small habitat patches. It revealed a surprising and unintuitive evolutionary phenomenon: how a non-monotone density-dependent dispersal strategy may evolve. Cooperation is surprisingly common at many levels of life, despite its obvious vulnerability to selfish cheating. This motivated two further applications. First, it is shown that density-dependent cooperative investment can evolve to have qualitatively different, monotone or non-monotone, forms depending on modelling details. The last study investigates the evolution of investment in two public-goods resources. The results suggest one general path by which division of labour can arise via evolutionary branching. In addition to the applications, two novel methodological derivations of fitness measures in structured metapopulations are given.
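For context, the invasion-fitness concept at the core of adaptive dynamics can be summarized as follows; the notation is the standard one from the adaptive dynamics literature and is not specific to the models of this thesis:

```latex
% Invasion fitness s_x(y): the long-term exponential growth rate of a rare
% mutant phenotype y in the environment set by a resident phenotype x.
% The mutant can invade precisely when the invasion fitness is positive:
\[
  s_x(y) > 0 .
\]
% Directional evolution then follows the selection gradient
\[
  D(x) \;=\; \left. \frac{\partial s_x(y)}{\partial y} \right|_{y = x},
\]
% and evolutionary branching can occur at a singular strategy x^* where
% D(x^*) = 0 and the second derivative of s_{x^*}(y) with respect to y,
% evaluated at y = x^*, is positive.
```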
Abstract:
Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm to determine this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
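To make the horizon-map discussion concrete, the sketch below computes per-receiver horizon angles for one scanline of a height field in the naive O(N)-per-receiver way, i.e. the baseline that the thesis reduces to amortized O(1) with its incremental traversal; the function and variable names are illustrative assumptions, not the thesis's implementation.

```python
# Hedged sketch: naive horizon-angle computation along one height-field
# scanline. Each receiver scans every sample to its left, so the cost is
# O(N) per receiver; incremental methods reuse work across receivers.
import math

def horizon_angles_naive(heights, spacing=1.0):
    """For each sample, the maximum elevation angle toward samples on its left."""
    angles = []
    for i, h_i in enumerate(heights):
        best = -math.pi / 2  # open horizon if nothing occludes
        for j in range(i):
            dx = (i - j) * spacing
            best = max(best, math.atan2(heights[j] - h_i, dx))
        angles.append(best)
    return angles

# Light arriving below the horizon angle is blocked; a full horizon map repeats
# this per azimuthal direction, which is why the per-receiver cost matters.
print(horizon_angles_naive([0.0, 2.0, 1.0, 4.0, 0.5]))
```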
Abstract:
Despite the fact that the literature on Business Intelligence and managerial decision-making is extensive, relatively little effort has been made to research the relationship between them. This field of study has become increasingly important, since the amount of data in the world grows every second. Companies require capabilities and resources in order to utilize structured and unstructured data from internal and external sources. The present Business Intelligence technologies, however, enable managers to utilize data effectively in decision-making. Based on the prior literature, the empirical part of the thesis identifies the enablers and constraints in the computer-aided managerial decision-making process. The theoretical part provides a preliminary understanding of the research area through a literature review. The key concepts, such as Business Intelligence and managerial decision-making, are explored by reviewing the relevant literature. Additionally, different data sources as well as data forms are analyzed in further detail. All key concepts are taken into account when the empirical part is carried out. The empirical part obtains an understanding of the real-world situation with respect to the themes covered in the theoretical part.
Three selected case companies are analyzed through theory-based statements, which are considered critical prerequisites for successful computer-aided managerial decision-making. The case study analysis, which is part of the empirical section, enables the researcher to examine the relationship between Business Intelligence and managerial decision-making. Based on the findings of the case study analysis, the researcher identifies the enablers and constraints through the case study interviews. The findings indicate that the constraints have a highly negative influence on the decision-making process. In addition, the managers are aware of the positive implications that Business Intelligence has for decision-making, but not all possibilities are yet utilized. As the main result of this study, a data-driven framework for managerial decision-making is introduced. This framework can be used when managerial decision-making processes are evaluated and analyzed.
Abstract:
In recent years, chief information officers (CIOs) around the world have identified Business Intelligence (BI) as their top priority and as the best way to enhance their enterprises' competitiveness. Yet many enterprises are struggling to realize the business value that BI promises. This discrepancy raises important questions, for example: what are the critical success factors of Business Intelligence and, more importantly, how can it be ensured that a Business Intelligence program enhances an enterprise's competitiveness? The main objective of the study is to find out how it can be ensured that a BI program meets its goals of providing competitive advantage to an enterprise. The objective is approached with a literature review and a qualitative case study. For the literature review, the main objective gives rise to three research questions (RQs). RQ1: What is Business Intelligence and why is it important for modern enterprises? RQ2: What are the critical success factors of Business Intelligence programs? RQ3: How can it be ensured that the CSFs are met? The qualitative case study covers the BI program of a Finnish global manufacturing company. The research questions for the case study are as follows. RQ4: What is the current state of the case company's BI program and what are the key areas for improvement? RQ5: In what ways could the case company's Business Intelligence program be improved? The case company's BI program is researched using the following methods: action research, semi-structured interviews, maturity assessment and benchmarking. The literature review shows that Business Intelligence is a technology-based information process that contains a series of systematic activities driven by the specific information needs of decision-makers. The objective of BI is to provide accurate, timely, fact-based information, which enables taking actions that lead to competitive advantage. There are many reasons for the importance of Business Intelligence, two of the most important being: 1) it helps to bridge the gap between an enterprise's current and desired performance, and 2) it helps enterprises stay in alignment with their key performance indicators, meaning it helps an enterprise align towards its key objectives. The literature review also shows that there are known critical success factors (CSFs) for Business Intelligence programs which must be met if the above-mentioned value is to be achieved, for example committed management support and sponsorship, a business-driven development approach and sustainable data quality. It further shows that the most common challenges are related to these CSFs and, more importantly, that overcoming them requires a more comprehensive form of BI, called Enterprise Performance Management (EPM). EPM links measurement to strategy by focusing on what is measured and why. The case study shows that many of the challenges faced in the case company's BI program are related to the above-mentioned CSFs. The main challenges are a lack of support and sponsorship from the business, a lack of visibility into overall business performance, the lack of a rigid BI development process, the lack of a clear purpose for the BI program, and poor data quality.
To overcome these challenges, the case company should define and design an enterprise metrics framework, make sure that BI development requirements are gathered and prioritized by the business, focus on data quality and ownership, and finally define clear goals for the BI program and then support and sponsor these goals.
Abstract:
The objective of the present study was to analyze the influence of spray mixture volume and flight height on herbicide deposition in aerial applications on pastures. The experimental plots were arranged in a pasture area in the district of Porto Esperidião (Mato Grosso, Brazil). In all treatments, the applications contained the herbicides aminopyralid and fluroxypyr (Dominum) at a dose of 2.5 L c.p. ha-1, together with the adjuvant mineral oil (Joint Oil) at a dose of 1.0 L and a tracer (rhodamine at a concentration of 0.6%) to determine deposition by high-performance liquid chromatography (HPLC). The experiment consisted of nine treatments comprising the combinations of three spray volumes (20, 30 and 50 L ha-1) and three flight heights (10, 30 and 40 m). The results showed that, on average, there was a tendency toward larger deposits at the lower flight heights, with a significant difference between the heights of 10 and 40 m. There was no significant difference among the deposits obtained with the different spray mixture volumes.