852 results for Initial data problem
Abstract:
BACKGROUND: Unsafe abortions are a serious public health problem and a major human rights issue. In low-income countries, where restrictive abortion laws are common, safe abortion care is not always available to women in need. Health care providers have an important role in the provision of abortion services. However, the shortage of health care providers in low-income countries is critical and exacerbated by the unwillingness of some health care providers to provide abortion services. The aim of this study was to identify, summarise and synthesise available research addressing health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia. METHODS: A systematic literature search of three databases was conducted in November 2014, as well as a manual search of reference lists. The selection criteria included quantitative and qualitative research studies written in English, regardless of the year of publication, exploring health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia. The quality of all articles that met the inclusion criteria was assessed. The studies were critically appraised, and thematic analysis was used to synthesise the data. RESULTS: Thirty-six studies, published between 1977 and 2014 and including data from 15 different countries, met the inclusion criteria. Nine key themes were identified as influencing the health care providers' attitudes towards induced abortions: 1) human rights, 2) gender, 3) religion, 4) access, 5) unpreparedness, 6) quality of life, 7) ambivalence, 8) quality of care and 9) stigma and victimisation. CONCLUSIONS: Health care providers in sub-Saharan Africa and Southeast Asia have moral-, social- and gender-based reservations about induced abortion. These reservations influence attitudes towards induced abortions and subsequently affect the relationship between the health care provider and the pregnant woman who wishes to have an abortion. A values clarification exercise among abortion care providers is needed.
Abstract:
The p-median model is used to locate P facilities to serve a geographically distributed population. Conventionally, it is assumed that the population patronizes the nearest facility and that the distance between the resident and the facility may be measured by the Euclidean distance. Carling, Han, and Håkansson (2012) compared two network distances with the Euclidean in a rural region with a sparse, heterogeneous network and a non-symmetric distribution of the population. For a coarse network and small P, they found, in contrast to the literature, the Euclidean distance to be problematic. In this paper we extend their work by using a refined network and systematically studying the case where P varies in size (2-100 facilities). We find that the network distance gives as good a solution as the travel-time network. The Euclidean distance gives solutions some 2-7 per cent worse than the network distances, and the solutions deteriorate with increasing P. Our conclusions extend to intra-urban location problems.
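For reference, a standard statement of the p-median objective referred to above (the notation is ours, not taken from the paper): with demand points i carrying weights w_i, a candidate set C, and a distance measure d(i, j) that may be Euclidean, network, or travel-time based, one seeks the set S of P facility sites minimising the total weighted distance.

```latex
\min_{S \subseteq C,\ |S| = P} \; \sum_{i \in I} w_i \, \min_{j \in S} d(i, j)
```

The comparison in the abstract amounts to solving this problem under different choices of d and measuring how much the resulting objective values differ.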
Abstract:
Good data quality with high complexity is often seen as important. Intuition says that the higher the accuracy and complexity of the data, the better the analytic solutions become, provided the increasing computing time can be handled. However, for most practical computational problems, high-complexity data mean that computation times become too long or that the heuristics used to solve the problem have difficulty reaching good solutions. This is stressed even further as the size of the combinatorial problem increases. Consequently, we often need simplified data to deal with complex combinatorial problems. In this study we address the question of how the complexity and accuracy of a network affect the quality of heuristic solutions for different sizes of the combinatorial problem. We evaluate this question by applying the commonly used p-median model, which locates p supply points in a network so as to serve n demand points. To evaluate this, we vary both the accuracy (the number of nodes) of the network and the size of the combinatorial problem (p). The investigation is conducted by means of a case study in Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 supply points we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal location we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000 (aggregated from the 1.5 million nodes). To find the optimal solution we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show a limited improvement in the optimal solutions when the accuracy of the road network increases and the combinatorial problem is simple (low p). When the combinatorial problem is complex (large p), the improvements from increasing the accuracy of the road network are much larger. The results also show that the choice of the best network accuracy depends on the complexity of the combinatorial problem (varying p).
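As an illustration of the kind of heuristic described above, here is a minimal simulated-annealing sketch for the p-median problem with a simple adaptive temperature rule. It is not the authors' implementation; the neighbourhood move, the cooling schedule, the acceptance target, and all parameter names are assumptions made for illustration.

```python
import math
import random

def pmedian_cost(facilities, demand_points, dist):
    # Total weighted distance from each demand point to its nearest open facility.
    return sum(w * min(dist(i, j) for j in facilities) for i, w in demand_points)

def simulated_annealing(candidates, demand_points, dist, p, iters=20000, accept_target=0.3):
    current = random.sample(candidates, p)
    best = list(current)
    cost = best_cost = pmedian_cost(current, demand_points, dist)
    temperature = 0.05 * cost          # initial temperature scaled to the cost magnitude (assumed rule)
    accepted = 0
    for it in range(1, iters + 1):
        # Neighbour move: swap one open facility for a closed candidate.
        trial = list(current)
        trial[random.randrange(p)] = random.choice([c for c in candidates if c not in current])
        trial_cost = pmedian_cost(trial, demand_points, dist)
        if trial_cost < cost or random.random() < math.exp((cost - trial_cost) / temperature):
            current, cost = trial, trial_cost
            accepted += 1
            if cost < best_cost:
                best, best_cost = list(current), cost
        if it % 500 == 0:
            # Adaptive tuning: cool faster when too many moves are accepted, slower otherwise.
            rate = accepted / 500
            temperature *= 0.8 if rate > accept_target else 0.95
            accepted = 0
    return best, best_cost
```

In the study's setting, `candidates` would be the aggregated candidate nodes (500 up to 67,000), `demand_points` the 15,000 weighted demand points, and `dist` a lookup of network distances.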
Abstract:
As scientific workflows, and the data they operate on, grow in size and complexity, the task of defining how those workflows should execute (which resources to use, where resources must be ready for processing, etc.) becomes proportionally more difficult. While "workflow compilers", such as Pegasus, reduce this burden, a further problem arises: since specifying details of execution is now automatic, a workflow's results are harder to interpret, as they are partly due to specifics of execution. By automating steps between the experiment design and its results, we lose the connection between them, hindering interpretation of results. To reconnect the scientific data with the original experiment, we argue that scientists should have access to the full provenance of their data, including not only parameters, inputs and intermediate data, but also the abstract experiment, refined into a concrete execution by the "workflow compiler". In this paper, we describe preliminary work on adapting Pegasus to capture the process of workflow refinement in the PASOA provenance system.
Abstract:
We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
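A minimal sketch of the forecast-driven adaptive-sampling idea described above, assuming the embedded model already supplies a current flow estimate and a forecast-derived flow prediction; the exponential shortening rule, the parameter names, and the thresholds are illustrative, not taken from the deployed system.

```python
import math

def next_sampling_interval(flow_estimate, flow_forecast, base_interval=900,
                           min_interval=60, sensitivity=2.0):
    """Shorten the sampling interval when a rapid change in flow is expected.

    flow_estimate:  current state estimate from the embedded flow model (hypothetical units)
    flow_forecast:  flow predicted from the hourly precipitation forecast
    base_interval:  default time between samples, in seconds
    """
    expected_change = abs(flow_forecast - flow_estimate) / max(abs(flow_estimate), 1e-6)
    # Exponentially shorten the interval as the expected relative change grows,
    # so abrupt hydrograph rises are sampled densely while baseflow is sampled sparsely.
    interval = base_interval * math.exp(-sensitivity * expected_change)
    return max(min_interval, min(base_interval, interval))
```

The node would call such a routine after each forecast query and reschedule its sensors accordingly, trading energy and reagent use against the risk of missing a transient.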
Abstract:
This dissertation examines the overall situation of road accidents in the context of road transport, where the evidence points to the Human Factor as the main cause of such events. A better understanding of this factor is expected to make it possible to improve traffic safety and transport production. The study seeks to highlight the importance of analyses related to road transport activity, variations in the demand on the circulation system, and the driver's task, from the perspective of ergonomics. It also aims to show the importance of these studies for better assessing the interactions of human, machine and road-environment factors and for the development of new road safety technologies and products. The literature review in the initial chapters reveals the state of the art and the importance of traffic safety at the international level. It also shows that all nations suffer from the same problem on their road networks, varying according to the reality of each one. Although traffic accidents are a phenomenon common to all nations, here they have reached the dimension of a social scourge, owing to their severity, and of an economic calamity, given the rise in production costs in road transport activity. The characteristics of the human factor that are fundamental to the driving task are analysed, as well as the causal link between human failures and the genesis of the accident, within a multifactorial and interactive system. The work is based on an extensive literature review. The case study, developed from a review of data from an earlier survey, confirms the hypothesis that drink-driving, regarded in the literature as the leading cause of road accidents, is present at high rates on the highways of RS, contradicting the conclusion of the earlier survey. Finally, it also offers recommendations for the development of concrete actions to improve road safety.
Abstract:
The initial endogenous growth models emphasized the importance of external effects in explaining sustainable growth across time. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Although cross-section results concur with theory, previous estimates using time series data rejected this hypothesis, showing a small coefficient far from unity. It seems that the problem lies not with the theory but with the techniques employed, which are unable to capture low-frequency movements in high-frequency data. This paper uses cointegration, a technique designed to capture the existence of long-run relationships in multivariate time series, to test the externalities hypothesis of endogenous growth. The results confirm the theory and conform to previous cross-section estimates. We show that there is long-run proportionality between output per hour and a measure of capital per hour. Using this result, we confirm the hypothesis that the implied Solow residual can be explained by government expenditures on infrastructure, which suggests a supply-side role for government affecting productivity and a decrease in the extent to which the Solow residual explains the variation of output.
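A worked statement of the testable restriction discussed in this and the following abstract, with notation assumed for illustration: with aggregate external effects to capital, the intensive-form production function and the externalities hypothesis can be written as

```latex
\ln\!\left(\frac{Y_t}{H_t}\right) \;=\; \ln A_t \;+\; \alpha \,\ln\!\left(\frac{K_t}{H_t}\right),
\qquad H_0:\ \alpha = 1,
```

so that, under the hypothesis, output per hour and capital per hour should be cointegrated with a unit long-run coefficient, which is precisely the long-run proportionality the cointegration tests look for.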
Abstract:
Initial endogenous growth models emphasized the importance of external effects and increasing returns in explaining growth. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Previous estimates using time series data rejected this hypothesis, although cross-country estimates did not. The problem lies with the techniques employed, which are unable to capture low-frequency movements of high-frequency data. Using cointegration, new time series evidence confirms the theory and conforms to cross-country evidence. The behavior of the implied Solow residual, which takes into account external effects to aggregate capital, is analyzed. The hypothesis that it is explained by government expenditures on infrastructure is confirmed. This suggests a supply-side role for government affecting productivity.
Abstract:
XML has become an important medium for data exchange, and is frequently used as an interface to (i.e. a view of) a relational database. Although a lot of work has been done on querying relational databases through XML views, the problem of updating relational databases through XML views has not received much attention. In this work, we take the first steps towards solving this problem. Using query trees to capture the notions of selection, projection, nesting, grouping, and heterogeneous sets found throughout most XML query languages, we show how XML views expressed using query trees can be mapped to a set of corresponding relational views. Thus, we transform the problem of updating relational databases through XML views into the classical problem of updating relational databases through relational views. We then show how updates on the XML view are mapped to updates on the corresponding relational views. Existing work on updating relational views can then be leveraged to determine whether or not the relational views are updatable with respect to the relational updates, and if so, to translate the updates to the underlying relational database. Since query trees are a formal characterization of view definition queries, they are not well suited for end users. We therefore investigate how a subset of XQuery can be used as a top-level language, and show how query trees can be used as an intermediate representation of view definitions expressed in this subset.
Abstract:
This study aims to identify, from the perspective of Human Resources professionals, the correlation between managerial competencies and the relationship built between leaders at different hierarchical levels and their direct teams, as well as the impact on the perception of pressure and stress by their respective teams. One of the challenges that managers and other leaders face is how to keep a team motivated and engaged in an environment of growing pressure, whether it results from an expansion or a contraction of their market. In the first and second chapters we present the introduction, the definition of the problem, and the objectives and scope of the study. In the third chapter we walk through the history of work, of the organisation and of man, seeking to ground in time the journey in search of a balance between work and worker, the place of man in the construction of this history, and some important thinkers in people management. This chapter opens the door to the following chapter, in which we study stress and, in particular, occupational stress. We also show the difference between pressure, which lies in the external environment, and stress, which derives from the individual's perception. Interest in this topic stems from the practical observation that pressure increases in times of market contraction, but also increases in periods when the market is heated. Therefore, if there is a correlation involving how a team or a subordinate "perceives" its leader, this information may be useful for People Management. We explore in depth the concept of Social Support, the network that protects the employee in high-pressure contexts, and place particular emphasis on the role of the leader as an important source of this support. In the fifth chapter we analyse conflict, as it is one of the topics with the greatest impact on occupational stress. Work relationships are, by their nature, long-term relationships, which makes the proper handling of conflict in the workplace a relevant item on the manager's agenda. However, proper handling depends both on a correct diagnosis and on the relationship of trust built by the leader. Although we are aware that a significant share of conflicts is related to corporate structures and strategies, we deliberately direct our attention to the managers and leaders who make up the leadership body of these organisations, aiming to provoke ideas that may contribute to improving relationships and results. In the sixth chapter we present the Methodology, and in the last chapter we present and discuss the results of the survey carried out with Human Resources professionals from large companies, seeking, through the eyes of HR, a perception of the leadership of each company and a comparison between leaders with greater or lesser capacity to negotiate with their teams. The study used a flexible framework supported by a semi-structured questionnaire, and the data were treated using Content Analysis methodology; no statistical treatment was necessary given the sample size. The study pointed to the need for a deeper exploration of the topic, since the literature and the initial results demonstrated an important role of the direct manager in the group's perception of stress and in the organisational climate. In closing, the research offers some reflections on the role of leadership and the role of corporations in balancing teams that live under constant pressure.
Abstract:
The present study aimed at an experimental verification of the influence of early stimulation on cognitive development. The idea for this work arose from the importance currently attributed to the problem of making better use of human potential for the construction of a better humanity. This possibility is reflected primarily at the core of the educational process, and in particular in the school. The study was theoretically grounded in Jerome Bruner, with two of his propositions as central: any subject can be taught effectively, in an honest form, to any child at any stage of development; and it is the primary aim of education to overcome immaturity, leading man to be more productive, better adjusted to his culture, developing his potential or going beyond it to create more. Seeking further grounding, the author drew on theoretical and experimental studies by Piaget, Montessori, Okón, Poppovic, Oslon, Austin & Postethwaite, and others, as well as research carried out by Brazilian authors and institutions. The study was aimed specifically at 4th-grade primary school pupils, in order to measure pedagogical competence and to gather data on initial schooling. The results obtained confirmed the fundamental hypothesis that the earlier a child is stimulated, the greater their later pedagogical competence will be, which is presumed to result from the acceleration of their cognitive development. However, in view of the boundaries set and the limitations encountered, the author suggests further research on the topic, so that, with increasingly consistent and broader experimental data, a psycho-pedagogical application adapted to the social and technological reality of our era may be proposed more emphatically.
Abstract:
We construct a model in which a first mover decides on its location before it knows the identity of the second mover; joint location results in a negative externality. Contracts are inherently incomplete since the first mover's initial decision cannot be specified. We analyze several kinds of rights, including damages, injunctions, and rights to exclude (arising from covenants or land ownership). There are cases in which allocating any of these basic rights to the first mover (i.e., first-party rights) is dominated by second-party rights, and cases in which the reverse is true. A Coasian result (efficiency regardless of the rights allocation) holds only under a limited set of conditions. As corollaries of a theorem ranking the basic rights regimes, a number of results emerge that contradict conventional wisdom, including the relative inefficiency of concentrated land ownership and the relevance of the generator's identity. We conclude with a mechanism and a new rights regime that each yield the first best in all cases.
Abstract:
This paper analyzes the demand and cost structure of the French market for academic journals, taking into account its intermediary role between researchers, who are both producers and consumers of knowledge. This two-sidedness echoes problems already observed in electronic markets (payment card systems, video game consoles, etc.), such as the chicken-and-egg problem: readers will not buy a journal if they do not expect its articles to be academically relevant, and researchers, who live under the mantra "publish or perish", will not submit to a journal with either limited public reach or a weak reputation. After merging several databases, we estimate an aggregated nested logit demand system simultaneously with a cost function. We identify the structural parameters of this market and find that price elasticities of demand are quite large and margins relatively low, indicating that this industry faces competitive constraints.
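For context, one common way to take an aggregated nested logit demand system to data is a Berry-style inversion of market shares; the equation below is a generic textbook form under assumed notation, not necessarily the specification estimated in the paper.

```latex
\ln s_{jt} - \ln s_{0t} \;=\; x_{jt}\beta \;-\; \alpha\, p_{jt} \;+\; \sigma \ln s_{j|g,t} \;+\; \xi_{jt},
```

where s_{jt} is the journal's market share, s_{0t} the outside-good share, s_{j|g,t} the journal's share within its nest, and sigma in [0, 1) captures within-nest substitution; the price coefficient alpha drives the elasticities and margins reported in the abstract.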
Abstract:
Life cycle general equilibrium models with heterogeneous agents have a very hard time reproducing the American wealth distribution. A common assumption made in this literature is that all young adults enter the economy with no initial assets. In this article, we relax this assumption, which is not supported by the data, and evaluate the ability of an otherwise standard life cycle model to account for U.S. wealth inequality. The new feature of the model is that agents enter the economy with assets drawn from an initial distribution of assets, which is estimated using a non-parametric method applied to data from the Survey of Consumer Finances. We find that heterogeneity with respect to initial wealth is key for this class of models to replicate the data. According to our results, American inequality can be explained almost entirely by the fact that some individuals are lucky enough to be born into wealth, while others are born with few or no assets.
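A minimal sketch of one way to implement the non-parametric step described above, using a weighted Gaussian kernel density estimate of young adults' net worth; the function names, the use of SCF survey weights, and the clipping of draws at zero are assumptions made for illustration, not the authors' procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde

def initial_asset_sampler(scf_assets, scf_weights, seed=0):
    """Non-parametric (kernel density) estimate of the initial asset distribution.

    scf_assets:  net worth of young adults observed in the Survey of Consumer Finances
    scf_weights: corresponding survey weights
    Returns a function that draws initial assets for model agents entering the economy.
    """
    kde = gaussian_kde(scf_assets, weights=scf_weights)
    rng = np.random.default_rng(seed)

    def draw(n_agents):
        # resample() draws from the fitted density; clip at zero to rule out negative holdings.
        return np.clip(kde.resample(n_agents, seed=rng).ravel(), 0.0, None)

    return draw
```

In a life cycle model, each cohort of entrants would then receive `draw(cohort_size)` as its initial asset vector instead of the conventional zero-asset assumption.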
Abstract:
Originally aimed at operational objectives, the continuous measurement of bottomhole pressure and temperature, recorded by permanent downhole gauges (PDG), finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. However, notwithstanding its unquestionable value, data from PDGs are characterized by a large noise content. Moreover, the presence of outliers within valid signal measurements seems to be a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system based on the coupling of fuzzy clustering with feed-forward neural networks is proposed for transient detection. The results obtained were considered quite satisfactory for offshore wells and met practical requirements for use.
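A short sketch of one common discrete-wavelet denoising recipe for a PDG pressure record, using PyWavelets; the choice of wavelet, decomposition level and soft universal threshold are assumptions for illustration, not necessarily the treatment adopted in the work.

```python
import numpy as np
import pywt

def denoise_pdg_signal(pressure, wavelet="db4", level=6):
    """Discrete-wavelet denoising of a permanent downhole gauge (PDG) pressure record.

    Soft-thresholds the detail coefficients with the universal threshold, which
    suppresses measurement noise while preserving transients (sharp pressure
    changes caused by rate variations) that later detection stages rely on.
    """
    coeffs = pywt.wavedec(pressure, wavelet, level=level)
    # Noise level estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(pressure)))
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(pressure)]
```

The cleaned signal could then be passed to an outlier filter and to the transient-detection stage mentioned in the abstract.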