989 results for Demand information
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads in many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level gravity trip distribution model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system, with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
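As an illustration of the distribution step described above, the following minimal sketch distributes parcel-generated trips to count sites with a gravity model. The parcel productions, count-site attractions, travel costs, and the exponential impedance parameter are all made-up assumptions; this is not the dissertation's Cube implementation.

```python
import numpy as np

# Hypothetical inputs: trips produced by 3 parcels, relative attraction of
# 2 count sites, and a parcel-to-site travel-cost matrix (minutes).
productions = np.array([120.0, 80.0, 200.0])
attractions = np.array([150.0, 250.0])
cost = np.array([[5.0, 12.0],
                 [9.0,  4.0],
                 [7.0,  8.0]])

def gravity_distribution(P, A, c, beta=0.1):
    """Distribute parcel productions to count sites with an exponential
    impedance f(c) = exp(-beta * c), row-normalised so each parcel's
    trips are fully allocated."""
    f = np.exp(-beta * c)                       # impedance term
    weights = A * f                             # A_j * f(c_ij)
    weights /= weights.sum(axis=1, keepdims=True)
    return P[:, None] * weights                 # trip table T_ij

T = gravity_distribution(productions, attractions, cost)
print(T)              # parcel-to-site trips
print(T.sum(axis=0))  # trips arriving at each count site
```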
Abstract:
Insect pollination underpins apple production, but the extent to which different pollinator guilds supply this service, particularly across different apple varieties, is unknown. Such information is essential if appropriate orchard management practices are to be targeted and proportional to the potential benefits pollinator species may provide. Here we use a novel combination of pollinator effectiveness assays (floral visit effectiveness), orchard field surveys (flower visitation rate) and pollinator dependence manipulations (pollinator exclusion experiments) to quantify the supply of pollination services provided by four different pollinator guilds to the production of four commercial varieties of apple. We show that not all pollinators are equally effective at pollinating apples, with hoverflies being less effective than solitary bees and bumblebees, and that the relative abundance of different pollinator guilds visiting apple flowers varies significantly across varieties. Based on this, the taxon-specific economic benefits to UK apple production were established. The contribution of insect pollinators to the economic output of all varieties was estimated to be £92.1M across the UK, with contributions varying widely across taxa: solitary bees (£51.4M), honeybees (£21.4M), bumblebees (£18.6M) and hoverflies (£0.7M). This research highlights the differences in the economic benefits of four insect pollinator guilds to four major apple varieties in the UK. This information is essential to underpin appropriate investment in pollination services management and provides a model that can be used in other entomophilous crops to improve our understanding of crop pollination ecology.
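The attribution logic described above can be sketched as follows. All figures, dependence ratios and weights are hypothetical placeholders rather than the study's estimates, and the simple visitation-times-effectiveness weighting is only one plausible reading of the approach.

```python
# Split the insect-dependent share of one variety's output among taxa in
# proportion to visitation frequency weighted by per-visit effectiveness.
output_value = 50.0           # £M output of one apple variety (hypothetical)
pollinator_dependence = 0.65  # share of output lost without insect pollination

visitation = {"solitary_bees": 0.45, "honeybees": 0.25,
              "bumblebees": 0.20, "hoverflies": 0.10}
effectiveness = {"solitary_bees": 0.9, "honeybees": 0.7,
                 "bumblebees": 0.9, "hoverflies": 0.3}

weights = {t: visitation[t] * effectiveness[t] for t in visitation}
total = sum(weights.values())
value_by_taxon = {t: output_value * pollinator_dependence * w / total
                  for t, w in weights.items()}
print(value_by_taxon)   # £M attributed to each guild for this variety
```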
Abstract:
Background and problem – As a result of financial crises and the realization of a broader stakeholder network, recent decades have seen an increase in stakeholder demand for non-financial information in corporate reporting. This has led to a situation of information overload, in which separate financial and sustainability reports have grown in length and complexity independently of each other. Integrated reporting has been presented as a solution to this problematic situation. The question is whether the corporate world believes this to be the solution and whether the development of corporate reporting is heading in this direction. Purpose – This thesis aims to examine and assess to what extent companies listed on the OMX Stockholm 30 (OMXS30), as per 2016-02-28, comply with the Strategic content element of the <IR> Framework and how this disclosure has developed since the framework's pilot project and official release, using a self-constructed disclosure index based on its specific items. Methodology – The purpose was fulfilled through an analysis of 104 annual reports comprising 26 companies during the period 2011-2014. The annual reports were assessed using a self-constructed disclosure index based on the <IR> Framework content element Strategy and Resource Allocation, where one point was given for each disclosed item. Analysis and conclusions – The study found that the OMXS30-listed companies to a large extent comply with the strategic content element of the <IR> Framework and that this compliance has seen steady growth throughout the researched time span. There is still room for improvement, however, with a total average framework compliance of 84% for 2014. Although many items are being reported on, there are indications that companies generally miss out on the core values of Integrated Reporting.
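A minimal sketch of how such a binary disclosure index can be scored is shown below; the item names and reports are hypothetical stand-ins, while the real index is built from the specific items of the Strategy and Resource Allocation content element.

```python
# One point per disclosed item; compliance = points / number of items.
ITEMS = ["strategic objectives", "resource allocation plans",
         "competitive advantage", "stakeholder linkage"]

def compliance_score(disclosed_items):
    """Share of index items found in a company's annual report."""
    return sum(item in disclosed_items for item in ITEMS) / len(ITEMS)

reports_2014 = {
    "Company A": {"strategic objectives", "resource allocation plans",
                  "competitive advantage"},
    "Company B": set(ITEMS),
}
scores = {c: compliance_score(d) for c, d in reports_2014.items()}
print(scores)
print("average compliance:", sum(scores.values()) / len(scores))
```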
Abstract:
I investigate the effects of information frictions on price-setting decisions. I show that firms' output prices and wages are less sensitive to aggregate economic conditions when firms and workers cannot perfectly understand (or know) the aggregate state of the economy. Prices and wages respond with a lag to aggregate innovations because agents learn slowly about those changes, and this delayed adjustment in prices makes output and unemployment more sensitive to aggregate shocks. In the first chapter of this dissertation, I show that workers' noisy information about the state of the economy helps explain why real wages are sluggish. In the context of a search and matching model, wages do not immediately respond to a positive aggregate shock because workers do not (yet) have enough information to demand higher wages. This increases firms' incentives to post more vacancies, and it makes unemployment volatile and sensitive to aggregate shocks. This mechanism is robust to two major criticisms of existing theories of sluggish wages and volatile unemployment: the flexibility of wages for new hires and the cyclicality of the opportunity cost of employment. Calibrated to U.S. data, the model explains 60% of the overall unemployment volatility. Consistent with empirical evidence, the response of unemployment to TFP shocks predicted by my model is large, hump-shaped, and peaks one year after the TFP shock, while the response of the aggregate wage is weak and delayed, peaking after two years. In the second chapter of this dissertation, I study the role of information frictions and inventories in firms' price-setting decisions in the context of a monetary model. In this model, intermediate-goods firms accumulate output inventories, observe aggregate variables with a one-period lag, and observe their nominal input prices and demand at all times. Firms face idiosyncratic shocks and cannot perfectly infer the state of nature. After a contractionary nominal shock, nominal input prices go down, and firms accumulate inventories because they perceive some positive probability that the nominal price decline is due to a good productivity shock. This prevents firms' prices from decreasing and makes current profits, households' income, and aggregate demand go down. According to my model simulations, a 1% decrease in the money growth rate causes output to decline by 0.17% in the first quarter and 0.38% in the second, followed by a slow recovery to the steady state. Contractionary nominal shocks also have significant effects on total investment, which remains 1% below the steady state for the first six quarters.
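The slow-learning mechanism in the first chapter can be illustrated with a stylised sketch: agents observe the aggregate state only through a noisy signal and update beliefs with a constant gain, so the wage they demand tracks a shock with a lag. The numbers and the constant-gain rule are illustrative assumptions, not the dissertation's calibrated search-and-matching model.

```python
import numpy as np

np.random.seed(0)
T = 20
true_state = np.zeros(T)
true_state[2:] = 1.0                  # permanent aggregate shock in period 2

gain = 0.3                            # constant Kalman-style updating gain
belief = np.zeros(T)
for t in range(1, T):
    signal = true_state[t] + np.random.normal(scale=0.2)  # noisy observation
    belief[t] = belief[t-1] + gain * (signal - belief[t-1])

wage = 1.0 + 0.5 * belief             # wages priced off the perceived state
print(np.round(wage, 3))              # adjusts only gradually toward 1.5
```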
Abstract:
Part 5: Service Orientation in Collaborative Networks
Abstract:
A number of important trends currently impact libraries. Academic libraries face a fundamental shift of collections toward ever-increasing proportions of electronic content; public libraries continue to see vigorous interest in print materials, now supplemented by demand to provide e-books for lending. Breeding will explore these and other trends and describe some of the available and emerging technologies that help libraries meet the challenges involved in this context.
Abstract:
IS/IT investments are seen as having an enormous potential impact on the competitive position of the firm and on its performance, and they demand the active and motivated participation of several stakeholder groups. The shortfall of evidence concerning the productivity of IT became known as the 'productivity paradox'. As Robert Solow, the Nobel laureate economist, stated, "we see computers everywhere except in the productivity statistics". An important stream of research conducted all over the world has tried to understand these phenomena, known in the literature as the «IS business value» field. However, there is a gap in the literature addressing the Portuguese situation: no empirical work has been done to date to understand the impact of Information Technology adoption on the productivity of Portuguese firms. Using data from two surveys conducted by the Portuguese National Institute of Statistics (INE), the Inquiry to the use of IT by Portuguese companies (IUTIC) and the Inquiry Harmonized to (Portuguese) companies (accounting data), this study relates (using regression analysis) the amounts spent on IT with the financial performance indicator Return on Equity, as a proxy for firm productivity, for Portuguese companies with more than 250 employees. The aim of this paper is to shed light on the Portuguese situation concerning the impact of IS/IT on the productivity of Portuguese top companies. Empirically, we test the impact of IT expenditure on the productivity of a sample of large Portuguese companies. Our results, based on firm-level data on Information Technology expenditure and firm productivity as measured by return on equity (1,186 observations) for the years 2003 and 2004, exhibit a negative impact of IT expenditure on firm productivity, in line with the claims of the 'productivity paradox'.
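A hedged sketch of the regression step is shown below, using synthetic data rather than the INE microdata; the negative coefficient is built into the data-generating process and only illustrates the ROE-on-IT-expenditure specification.

```python
import numpy as np

# Synthetic firm-level data: ROE_i = a + b * IT_i + e_i, with a negative b.
rng = np.random.default_rng(1)
n = 1186
it_spend = rng.lognormal(mean=12.0, sigma=1.0, size=n)      # IT expenditure
roe = 0.08 - 1e-8 * it_spend + rng.normal(scale=0.05, size=n)

# Ordinary least squares via the normal equations (intercept + slope).
X = np.column_stack([np.ones(n), it_spend])
coef, *_ = np.linalg.lstsq(X, roe, rcond=None)
print("intercept:", coef[0], "IT coefficient:", coef[1])
# The fitted slope reflects the built-in negative effect of IT spend on ROE.
```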
Abstract:
One of the great challenges for the scientific community regarding theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in DNA sequences where such sequences are biologically relevant. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials (p(x)) and code generator polynomials (g(x)) is quite important for characterizing error-correcting subclasses of G-linear codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in mutational analysis and polymorphism studies, the production of new drugs and genetic improvement, among other things, resulting in reduced time and laboratory costs.
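As a concrete, much-simplified instance of a labeling combined with cyclic encoding, the sketch below maps nucleotides onto bit pairs and encodes them with the standard binary (7,4) cyclic code generated by g(x) = x^3 + x + 1, which divides x^7 + 1 over GF(2). The paper's alphabets, labelings and G-linear codes are more general; this is only an illustration.

```python
# Hypothetical nucleotide labeling onto bit pairs.
LABEL = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}

def polymul_mod2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (low order first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

g = [1, 1, 0, 1]                        # g(x) = 1 + x + x^3

def encode(message_bits):
    """Non-systematic cyclic encoding: c(x) = m(x) * g(x), with deg m < 4."""
    return polymul_mod2(message_bits, g)

# Label a 2-nucleotide fragment and encode the resulting 4 message bits.
fragment = "AC"
m = [bit for nt in fragment for bit in LABEL[nt]]
print(encode(m))                        # 7-bit codeword
```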
Abstract:
To assess the completeness and reliability of the data in the Information System on Live Births (Sinasc). A cross-sectional analysis of the reliability and completeness of Sinasc data was performed using a sample of Live Birth Certificates (LBC) from 2009, related to births in Campinas, Southeast Brazil. For data analysis, hospitals were grouped according to category of service (Unified National Health System, private or both), 600 LBCs were randomly selected and the data were collected in LBC copies through the mothers' and newborns' hospital records and by telephone interviews. The completeness of the LBCs was evaluated by calculating the percentage of blank fields, and the agreement between the original LBCs and the copies was evaluated by Kappa and intraclass correlation coefficients. The percentage of completeness of the LBCs ranged from 99.8% to 100%. For most items, agreement was excellent. However, agreement was acceptable for marital status, maternal education and the newborn's race/color, low for prenatal visits and the presence of birth defects, and very low for the number of deceased children. The results showed that the municipal Sinasc is reliable for most of the studied variables. Investment in professional training is suggested in order to improve the system's capacity to support the planning and implementation of health activities for the benefit of the maternal and child population.
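Agreement for a single categorical field can be computed with Cohen's kappa, as in the following sketch; the field values are hypothetical, and the study also used intraclass correlation coefficients for continuous variables.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Agreement between two ratings beyond what chance would predict."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c]
                   for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical marital-status entries on original certificates vs. copies.
original = ["single", "married", "married", "single", "married"]
copy     = ["single", "married", "single",  "single", "married"]
print(round(cohen_kappa(original, copy), 3))
```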
Abstract:
El Niño-Southern Oscillation (ENSO) is a climatic phenomenon related to the inter-annual variability of global meteorological patterns, influencing sea surface temperature and rainfall variability. It influences human health indirectly through extreme temperature and moisture conditions that may accelerate the spread of some vector-borne viral diseases, such as dengue fever (DF). This work examines the spatial distribution of the association between ENSO and DF in the countries of the Americas during 1995-2004, a period that includes the 1997-1998 El Niño, one of the most important climatic events of the 20th century. Data on the Southern Oscillation Index (SOI), indicating El Niño-La Niña activity, were obtained from the Australian Bureau of Meteorology. The annual DF incidence (AIy) by country was computed using Pan American Health Organization data. SOI and AIy values were standardised as deviations from the mean and plotted as bar-line graphs. The regression coefficient values between SOI and AIy (rSOI,AI) were calculated and spatially interpolated by an inverse distance weighted algorithm. The results indicate that among the five years registering the highest numbers of cases (1998, 2002, 2001, 2003 and 1997), four had El Niño activity. In the southern hemisphere, the annual spatially weighted mean centre of the epidemics moved southward, from 6° 31' S in 1995 to 21° 12' S in 1999, and the rSOI,AI values were negative in Cuba, Belize, Guyana and Costa Rica, indicating synchrony between higher DF incidence rates and higher El Niño activity. The rSOI,AI map allows visualisation of a graded surface with higher values of ENSO-DF association for Mexico, Central America, the northern Caribbean islands and the extreme north-northwest of South America.
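The spatial interpolation step can be sketched with a simple inverse-distance-weighting routine, as below; the coordinates and rSOI,AI values are made up for illustration only.

```python
import numpy as np

# Hypothetical country centroids (lon, lat) and rSOI,AI values at those points.
stations = np.array([[-80.0, 22.0],
                     [-88.5, 17.2],
                     [-58.9,  4.9]])
r_values = np.array([-0.62, -0.48, -0.55])

def idw(point, coords, values, power=2.0):
    """Inverse-distance-weighted estimate of a surface value at `point`."""
    d = np.linalg.norm(coords - point, axis=1)
    if np.any(d == 0):                 # exact hit: return the known value
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

print(idw(np.array([-75.0, 15.0]), stations, r_values))
```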
Abstract:
The scope of this study is to identify the prevalence of access to information about how to prevent oral problems among schoolchildren in the public school network, as well as the factors associated with such access. This is a cross-sectional, analytical study conducted among 12-year-old schoolchildren in a Brazilian municipality with a large population. The examinations were performed by 24 trained and calibrated dentists with the aid of 24 recorders. Data collection occurred in 36 public schools selected from the 89 public schools of the city. Descriptive, univariate and multiple analyses were conducted. Of the 2,510 schoolchildren included in the study, 2,211 reported having received information about how to prevent oral problems. Access to such information was greater among those who used private dental services, and lower among those who used the service for treatment, who evaluated the service as regular or bad/awful, who used only a toothbrush, or a toothbrush and tongue scrubbing, as means of oral hygiene, and who reported not being satisfied with the appearance of their teeth. The conclusion drawn is that the majority of schoolchildren had access to information about how to prevent oral problems, though access was associated with characteristics of the health services, health behaviour and outcomes.
Abstract:
One of the most important properties of quantum dots (QDs) is their size. Their size determines their optical properties and, in a colloidal medium, their range of interaction. The most common techniques used to measure QD size are transmission electron microscopy (TEM) and X-ray diffraction. However, these techniques require the sample to be dried and kept under vacuum. In this way, any hydrodynamic information is excluded, and the preparation process may even alter the size of the QDs. Fluorescence correlation spectroscopy (FCS) is an optical technique with single-molecule sensitivity capable of extracting the hydrodynamic radius (HR) of the QDs. The main drawback of FCS is the blinking phenomenon, which alters the correlation function and results in an apparent QD size smaller than the real one. In this work, we developed a method to exclude blinking from the FCS data and measured the HR of colloidal QDs. We compared our results with TEM images, and the HR obtained by FCS is larger than the radius measured by TEM. We attribute this difference to the cap layer of the QD, which cannot be seen in the TEM images.
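The conversion from an FCS fit to a hydrodynamic radius can be sketched with the Stokes-Einstein relation, as below; the beam waist, diffusion time and temperature are illustrative assumptions, not the paper's measured values.

```python
import math

# From the fitted correlation curve one obtains the diffusion time tau_D;
# with the lateral beam waist w_xy this gives D = w_xy^2 / (4 * tau_D), and
# Stokes-Einstein converts D into R_H = k_B * T / (6 * pi * eta * D).
k_B = 1.380649e-23   # J/K
T = 298.15           # K
eta = 0.89e-3        # Pa*s, viscosity of water at 25 C
w_xy = 0.25e-6       # m, lateral radius of the confocal volume (assumed)
tau_D = 350e-6       # s, diffusion time from the FCS fit (assumed)

D = w_xy**2 / (4 * tau_D)
R_H = k_B * T / (6 * math.pi * eta * D)
print(f"D = {D:.3e} m^2/s, R_H = {R_H * 1e9:.2f} nm")
```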
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
According to the projections of the latest IPCC (Intergovernmental Panel on Climate Change) report, in 2007, by the middle of this century the CO2 concentration in the atmosphere will increase, possibly reaching 720 μmol mol-1. Consequently, temperature will rise by up to +3 °C, together with changes in precipitation patterns. The same report suggests that this may lead to a gradual replacement of the tropical forest by savanna-like vegetation in the eastern Amazon, although nothing is conclusive. Given these possibilities, the question arises: how will the tree species that make up the flooded regions of the Amazon respond to the coming climate changes? Although these projections are pessimistic, flooding will still occur in the Amazon for many years, and it is very important to understand the effects of flooding on the physiological responses of plants in the context of climate change. The main effects on metabolic and hormonal signalling during flooding are reviewed, and the possible effects that climate change may have on Amazonian plants are discussed. The available information suggests that, under flooding, plants tend to mobilise reserves to supply the carbon demand needed to maintain metabolism under the stress of oxygen deficiency. Up to a certain limit, with increasing CO2 concentration, plants tend to photosynthesise more and produce more biomass, which may increase further with a temperature rise of up to 3 °C. Conversely, with flooding, there is a general decrease in growth potential, and it is possible that under conditions of elevated CO2 and temperature the positive and negative effects will add up. Thus, the physiological responses may be attenuated, or greater growth may even be promoted for most floodplain species up to the middle of the century. However, when temperature and CO2 reach values above the optima for most plants, they will probably reduce their physiological activity.
Abstract:
OBJECTIVE: Despite the importance of external causes as a public health problem, little is known about the demand for urgent and emergency care services. This study aims to characterise morbidity due to external causes in urgent and emergency care units in the municipality of Cuiabá, Mato Grosso, Brazil. METHOD: This is a cross-sectional, descriptive study. A total of 3,786 victims of external causes attended at the five urgent and emergency care units of the Municipal Health Department of Cuiabá between 1 May and 30 June 2005 were analysed. RESULTS: Approximately 88% of the visits involved accident victims, 9% corresponded to assaults and 2% to self-inflicted injuries. Transport accidents accounted for 22% of the visits, with motorcyclists being the main victims (49%); falls were the most frequent cause in the group of other external causes of accidental injury. Prevalence among men exceeded that among women. Most victims were under 40 years of age (79%). However, analysis by type of external cause shows different results according to sex and age group. About half of the events occurred at home and, in most cases, the victims were discharged after care; the mortality rate was low (0.4%). CONCLUSION: The results reveal the importance of systematic analysis of data on victims of accidents and violence attended at urgent and emergency care units, as a complement to information on mortality and hospital morbidity, with a view to monitoring these causes.