Abstract:
The paper examines the concept of the feminization of poverty and reviews the limited evidence on the extent of poverty among women. It then examines the arguments that poor women can be highly effective change agents for the eradication of poverty. However, while women may be considered instruments for eliminating poverty, a lack of understanding and appreciation of the impact of their sex roles and of gender roles and stereotypes continues to prevent the realization of this potential. The paper therefore moves on to summarize the differences between sex and gender and examines how both women's sex roles and the impact of gender roles and stereotypes lead to the feminization of poverty and exclude women from participation in development and in programmes to eliminate poverty. The paper reviews the major approaches: women in development (WID), gender and development (GAD), and the extension of GAD known as mainstreaming. Finally, it considers the issue of poverty, women and gender in Nigeria, and advances a number of recommendations on women, gender, poverty and rural development for the consideration of policy-makers in Nigeria.
Abstract:
The paper attributed the decline in information provision in Nigeria to poor library development, which can in turn be traced to poor funding. The consequence is that current journals and books are not available in Nigerian fisheries libraries. Information, which can be regarded as the first factor of production on which other factors such as land, labour and capital depend, can only be provided at the right time when libraries are properly funded. For now, if there is to be an increase in fish production, poverty alleviation and food security in Nigeria, our fisheries scientists and policy makers will have to rely on international sources of information, taking advantage of internet connectivity. Some such sources discussed in this paper are ASFA, AGORA, FAO, DOAJ, FISHBASE, IAMSLIC, INASP, INASP-PERI, INASP-AJOL, ODINAFRICA, SIFAR, WAS, and ABASFR. However, reliance on international sources must not come at the total neglect of harnessing Nigerian fisheries information. For the Nigerian Fisheries and Aquatic Sciences Database being developed by NIFFR to attain an international status like those enumerated above, scientists and publishers are requested to take the trouble of depositing copies of their publications with NIFFR for inclusion in the Database.
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to the estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and where the treatment of HIV/AIDS is headed.
Chapter Two describes the datasets used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and includes data from 1984 to the present. The second database is the public dataset of the Multicenter AIDS Cohort Study (MACS), covering the period between 1984 and October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
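As a minimal illustration of this kind of distributional check, the sketch below tests simulated (hypothetical) CD4 counts for normality; the data, parameters, and the choice of the Shapiro-Wilk test are assumptions for illustration, not the thesis's actual analysis.

```python
# A minimal sketch of checking whether CD4 T cell counts are non-Gaussian,
# using hypothetical (simulated) data rather than the thesis's clinic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical CD4 counts (cells/uL); real counts are typically right-skewed.
cd4 = rng.lognormal(mean=6.0, sigma=0.6, size=500)

# Shapiro-Wilk test of the null hypothesis that the sample is Gaussian.
stat, p = stats.shapiro(cd4)
print(f"Shapiro-Wilk W={stat:.3f}, p={p:.2e}")  # small p => reject normality

# A common remedy: run analyses on log- (or sqrt-) transformed counts.
stat_log, p_log = stats.shapiro(np.log(cd4))
print(f"log-transformed: W={stat_log:.3f}, p={p_log:.2e}")
```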
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era, characterized by a new class of drugs and new technology, changed the way we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test the efficacy of an antiretroviral regimen. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
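A minimal sketch of this type of survival analysis is given below, using simulated data and the lifelines library; the cohort size is taken from the abstract, but the event and censoring times are invented for illustration.

```python
# A minimal sketch of a Kaplan-Meier analysis of time on HAART until an
# AIDS-defining illness (the endpoint), with simulated data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 213                                   # cohort size from the abstract
time_to_event = rng.exponential(40, n)    # hypothetical months to failure
censor_time = rng.uniform(12, 60, n)      # hypothetical follow-up limits

durations = np.minimum(time_to_event, censor_time)
observed = (time_to_event <= censor_time).astype(int)  # 1 = clinical failure

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
print(kmf.survival_function_.tail())      # estimated S(t) near end of follow-up
print(f"median time to failure: {kmf.median_survival_time_:.1f} months")
```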
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, failed to control viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated. The direct lifetime cost of treating each HIV-infected patient with HAART is estimated at between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, comparable with the incremental cost per year of life saved by coronary artery bypass surgery.
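The per-life-year figure can be sanity-checked with back-of-the-envelope arithmetic; the life-year values below are inferred from the abstract's own numbers and are illustrative only.

```python
# Back-of-the-envelope check of the ~$101,000-per-life-year figure quoted
# above: dividing each lifetime-cost bound by an assumed number of life-years
# gained (inferred from the abstract's ranges) recovers roughly $101k/year.
for cost, years in [(353_000, 3.5), (598_000, 5.9)]:
    print(f"${cost:,} over {years} life-years -> ${cost / years:,.0f} per life-year")
```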
Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and the HIV epidemic is not over. The results presented here suggest that the decreases in morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
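To make the selection rule concrete, here is a toy, noise-free sketch of an EC2-style adaptive loop: hypotheses are grouped into theories (equivalence classes), and the greedy step picks the test that most reduces the weight of edges between hypotheses in different classes. All names, sizes, and the deterministic-response assumption are illustrative simplifications, not BROAD's implementation.

```python
# Toy EC2-style adaptive test selection over a small hypothesis space.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_hyp, n_tests = 12, 30
theory = rng.integers(0, 3, n_hyp)            # class label of each hypothesis
pred = rng.integers(0, 2, (n_hyp, n_tests))   # predicted choice per (hyp, test)
prior = np.full(n_hyp, 1 / n_hyp)

def cut_weight(alive, p):
    """Weight of edges between still-alive hypotheses in different theories."""
    return sum(p[i] * p[j]
               for i, j in itertools.combinations(np.flatnonzero(alive), 2)
               if theory[i] != theory[j])

alive = np.ones(n_hyp, bool)
truth = 0                                     # index of the true hypothesis
for _ in range(n_tests):
    if len(set(theory[alive])) == 1:          # one theory left: done
        break
    # Greedy: pick the test minimizing the expected remaining edge weight.
    def expected_remaining(t):
        out = 0.0
        for y in (0, 1):
            consistent = alive & (pred[:, t] == y)
            p_y = prior[consistent].sum() / prior[alive].sum()
            out += p_y * cut_weight(consistent, prior)
        return out
    t = min(range(n_tests), key=expected_remaining)
    alive &= (pred[:, t] == pred[truth, t])   # observe the true response
print("surviving theory:", set(theory[alive]))
```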
We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the chosen lotteries. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and since we do not find any signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed cost discounting), and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
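For reference, the textbook functional forms of the discount factor D(t) for the families compared here are sketched below; the parameterizations are standard ones and may differ from the thesis's exact notation.

```latex
% Textbook discount-factor forms (parameterizations may differ from the thesis):
\begin{align*}
\text{exponential:}            \quad & D(t) = \delta^{t}, \qquad 0 < \delta < 1\\
\text{hyperbolic:}             \quad & D(t) = \frac{1}{1 + k t}, \qquad k > 0\\
\text{quasi-hyperbolic:}       \quad & D(0) = 1,\ \ D(t) = \beta\,\delta^{t} \ \ (t > 0), \qquad 0 < \beta \le 1\\
\text{generalized hyperbolic:} \quad & D(t) = (1 + \alpha t)^{-\beta/\alpha}, \qquad \alpha, \beta > 0
\end{align*}
```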
In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
We also test the predictions of behavioural theories in the "wild". We pay particular attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in a distinctly different way than the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than can be explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitutes will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
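A minimal sketch of such a model is given below: a multinomial logit whose utility includes a gain/loss term around a reference price. The functional form, the loss-aversion coefficient (2.25, a value often quoted in the prospect theory literature), and the prices are illustrative assumptions, not the paper's estimated model.

```python
# A minimal logit discrete-choice sketch with a reference-dependent,
# loss-averse price term, in the spirit of the analysis described above.
import numpy as np

def utility(price, ref_price, alpha=1.0, lam=2.25):
    """Linear price utility plus a gain/loss term around the reference price.
    lam > 1 means losses (price above reference) loom larger than gains."""
    gain = max(ref_price - price, 0.0)
    loss = max(price - ref_price, 0.0)
    return -alpha * price + gain - lam * loss

def choice_probs(prices, ref_prices):
    """Multinomial-logit choice probabilities over the offered items."""
    u = np.array([utility(p, r) for p, r in zip(prices, ref_prices)])
    e = np.exp(u - u.max())
    return e / e.sum()

# Two substitute items at the same price; item 0 was recently discounted
# (reference price 8), so returning to 10 registers as a "loss" and
# depresses its demand relative to item 1.
print(choice_probs(prices=[10.0, 10.0], ref_prices=[8.0, 10.0]))
```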
In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
The Legislature is vital for collective self-government and for restraining power, and it must be reinvigorated. This dissertation offers proposals to increase the legitimacy of the Legislative Branch that do not depend on political reform. The first proposal consists of correcting certain practices that compromise the performance of the Legislature, namely the failure to consider vetoes, the current form of drafting and executing the budget law, the excessive power of party leaders, and the limited judicial protection of due legislative process. The second proposal lies in strengthening the thematic committees, arenas better suited than the Plenary for developing Parliament's deliberative potential. These fractional bodies can employ impact assessment, a tool intended to improve legislation. The third proposal corresponds to the regulation of lobbying. Institutionalizing this activity is essential for making it transparent, thereby enabling oversight, and for minimizing the imbalance in access to decision makers.
Abstract:
The centralized paradigm of a single controller and a single plant, upon which modern control theory is built, is no longer applicable to modern cyber-physical systems of interest, such as the power grid, software-defined networks, or automated highway systems, as these are all large-scale and spatially distributed. Both the scale and the distributed nature of these systems have motivated the decentralization of control schemes into local sub-controllers that measure, exchange, and act on locally available subsets of the globally available system information. This decentralization of control logic leads to different decision makers acting on asymmetric information sets, introduces the need for coordination between them, and, perhaps not surprisingly, makes the resulting optimal control problem much harder to solve. In fact, shortly after such questions were posed, it was realized that seemingly simple decentralized optimal control problems are computationally intractable, with the Witsenhausen counterexample being a famous instance of this phenomenon. Spurred on by this perhaps discouraging result, a concerted 40-year effort to identify tractable classes of distributed optimal control problems culminated in the notion of quadratic invariance, which loosely states that if sub-controllers can exchange information with each other at least as quickly as the effect of their control actions propagates through the plant, then the resulting distributed optimal control problem admits a convex formulation.
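For context, the standard statement of quadratic invariance (due to Rotkowitz and Lall) can be sketched as follows; the notation here is generic and may differ from the thesis's.

```latex
% Quadratic invariance, in its standard form; G is the plant and S the
% subspace of admissible (structured) controllers:
S \ \text{is quadratically invariant under} \ G
  \quad\Longleftrightarrow\quad
  K G K \in S \ \ \text{for all} \ K \in S .
% Under QI, K \in S \iff K(I - GK)^{-1} \in S, so the change of variables
% Q = K(I - GK)^{-1} renders the structured optimal control problem convex.
```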
The identification of quadratic invariance as an appropriate means of "convexifying" distributed optimal control problems led to renewed enthusiasm in the controller synthesis community, resulting in a rich set of results over the past decade. The contributions of this thesis can be seen as part of this broader family of results, with a particular focus on closing the gap between theory and practice by relaxing or removing assumptions made in the traditional distributed optimal control framework. Our contributions are to the foundational theory of distributed optimal control, and fall under three broad categories: controller synthesis, architecture design, and system identification.
We begin by providing two novel controller synthesis algorithms. The first is a solution to the distributed H-infinity optimal control problem subject to delay constraints, and provides the only known exact characterization of delay-constrained distributed controllers satisfying an H-infinity norm bound. The second is an explicit dynamic programming solution to a two-player LQR state-feedback problem with varying delays. Accommodating varying delays represents an important first step in combining distributed optimal control theory with the area of Networked Control Systems, which considers lossy channels in the feedback loop.

Our next set of results concerns controller architecture design. When designing controllers for large-scale systems, the architectural aspects of the controller, such as the placement of actuators, sensors, and the communication links between them, can no longer be taken as given -- indeed, the task of designing this architecture is now as important as the design of the control laws themselves. To address this task, we formulate the Regularization for Design (RFD) framework, a unifying, computationally tractable approach, based on the model matching framework and atomic norm regularization, for the simultaneous co-design of a structured optimal controller and the architecture needed to implement it.

Our final result is a contribution to distributed system identification. Traditional system identification techniques such as subspace identification are not computationally scalable, and destroy rather than leverage a priori information about the system's interconnection structure. We argue that in the context of system identification, an essential building block of any scalable algorithm is the ability to estimate local dynamics within a large interconnected system. To that end, we propose a promising heuristic for identifying the dynamics of a subsystem that is still connected to a large system. We exploit the fact that the transfer function of the local dynamics is low-order but full-rank, while the transfer function of the global dynamics is high-order but low-rank, to formulate this separation task as a nuclear norm minimization problem. Finally, we conclude with a brief discussion of future research directions, with a particular emphasis on how to incorporate the results of this thesis, and those of optimal control theory in general, into a broader theory of dynamics, control, and optimization in layered architectures.
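As a concrete illustration of the low-rank/full-rank separation idea above, the sketch below solves a generic nuclear-norm decomposition with cvxpy; the data, the penalty, and the specific program are simplifications assumed for illustration, not the thesis's formulation.

```python
# Decompose an observed response matrix into a low-rank "global" part plus a
# small, full-rank "local" residual via nuclear norm minimization.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(3)
n = 30
global_part = np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank 1
local_part = 0.1 * rng.standard_normal((n, n))                          # full rank
T = global_part + local_part   # observed data matrix

L = cp.Variable((n, n))        # low-rank (global) component
S = cp.Variable((n, n))        # small, full-rank (local) component
lam = 0.5                      # illustrative trade-off weight
prob = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum_squares(S)),
                  [L + S == T])
prob.solve()
print("recovered rank of L ~", np.linalg.matrix_rank(L.value, tol=1e-3))
```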
Abstract:
This dissertation investigates the scientific article genre and its cultural constraints, that is, the textual markers that may pose particular difficulty for the translator owing to problems of interculturality, from the perspective of the concept of norms in translation studies and of the description of the scientific article genre. The aim of this work is to identify these constraints in scientific articles in the field of Geriatrics and Gerontology, exemplifying parts of this universe of constraints by surveying the characteristics of the genre and by comparing translations. It seeks to demonstrate, through theoretical reflection, practical examples and comparative analyses, how translation benefits from the study of genres, of norms and of the survey of cultural constraints in support of the task of translating scientific articles. The corpora analysis procedures were based on the model of Lambert and Van Gorp (1985) for the analysis of literary translation, adapted here to technical translation. Finally, analysing the cultural constraints identified in this research, as well as the characteristics of the genre, the study culminates in reflections on the translation of scientific articles.
Abstract:
Recent studies have begun to consider the possible relationship between the phenomenon of internationalization and the corporate social responsibility of organizations. In this paper we analyse the extent to which internationalization contributes to a substantial improvement in firms' social performance. Using a sample of 102 US firms from the chemical, energy and industrial machinery sectors, our results show that not all dimensions of internationalization allow firms to acquire valuable knowledge leading to improved social performance. Specifically, we find that while the percentage of sales in foreign markets and cultural international diversification favour the implementation of advanced corporate social responsibility practices that result in better social performance (meeting the social demands of stakeholders in the different markets where firms operate), geographic international diversification has no direct effect on firms' social performance. It is therefore not so much the number of markets in which the firm operates, but the degree of cultural differentiation among them, that leads the firm to improve its social performance. This paper includes relevant implications for academia, managers and public regulators.
Abstract:
Effects on fish reproduction can result from a variety of toxicity mechanisms first operating at the molecular level. Notably, the presence in the environment of compounds termed endocrine disrupting chemicals (EDCs) can cause adverse effects on reproduction by interfering with the endocrine system. In some cases, exposure to EDCs leads to feminization of the animal, and male fish may develop oocytes in the testis (the intersex condition). Mugilid fish are well-suited sentinel organisms for studying the effects of reproductive EDCs in the monitoring of estuarine and marine environments. Up-regulation of aromatases and vitellogenins in males and juveniles, and the presence of intersex individuals, have been described in a wide array of mullet species worldwide. There is a need to develop new molecular markers to identify early feminization responses and the intersex condition in fish populations, studying the mechanisms that regulate gonad differentiation under exposure to xenoestrogens. Interestingly, electrophoresis of gonad RNA shows strong expression of 5S rRNA in oocytes, indicating the potential of 5S rRNA and its regulating proteins to become useful molecular markers of oocyte presence in testis. The use of these oocyte markers to sex fish and to identify intersex mullets could therefore constitute a powerful molecular means of assessing xenoestrogenicity in field conditions.
Abstract:
Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm of on-demand payment for information and communication technologies. Small and medium-sized enterprises are expected to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities of the cloud have been widely discussed, entry into the cloud is still lacking in practical, real frameworks. This paper aims to fill this gap by presenting a real tool, already implemented and tested, which can be used as a cloud computing adoption decision tool. The tool uses a diagnosis based on specific questions to gather the required information and subsequently provides the user with valuable information for deploying the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge of cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest in and low knowledge of this subject, and the tool presented aims to redress this mismatch insofar as possible. © 2015 Bildosola et al.; distributed under the terms of the Creative Commons Attribution License.
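A purely illustrative sketch of how such a questionnaire-driven diagnostic might score answers is given below; the questions, weights, and threshold are invented for illustration, and the paper's actual tool is considerably more elaborate.

```python
# Toy questionnaire scoring for a SaaS-adoption recommendation.
QUESTIONS = [
    ("Does the business area handle highly sensitive data on-premises only?", -2),
    ("Is demand for this application seasonal or highly variable?", +2),
    ("Is the current software licensed per-seat with high upfront cost?", +1),
    ("Does the team lack in-house IT staff for maintenance?", +1),
]

def recommend_saas(answers):
    """answers: list of booleans aligned with QUESTIONS."""
    score = sum(w for (_, w), a in zip(QUESTIONS, answers) if a)
    return "candidate for SaaS migration" if score >= 2 else "keep on-premises for now"

print(recommend_saas([False, True, True, False]))  # -> candidate for SaaS migration
```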
Abstract:
Objective. To evaluate the impact of actions to promote fruit and vegetable (FV) consumption in the workplace. Method. A non-randomized before-and-after intervention study with a historical control group, carried out in a scientific research company. The study population consisted of employees who had lunch in the company restaurant on the study days. In the initial diagnosis, data were collected on the characteristics of the company studied and of the catering company, the sociodemographic characteristics and FV consumption of the individuals studied, and their opinions on FV-related topics. A focus group was also held with opinion leaders in the company in order to identify the determinants of their FV consumption and thereby inform the design of strategies for its promotion. The intervention lasted eight months and comprised two components: environmental (the company cafeteria) and educational (directed at individuals). The first invested in raising the awareness of the owner of the catering concession and its technical manager of the importance of promoting FV, and in regular contacts with the latter to support the supply of FV in the cafeteria. The second comprised in-person activities, the distribution of educational materials, and electronic communication actions. In the final diagnosis, in addition to employees' FV consumption, the employees' level of exposure to the intervention and their impressions of changes in the company restaurant regarding the supply of FV were recorded. The analysis of the intervention's impact consisted of examining the significance of the differences observed between proportions or means obtained before and after the intervention. The association between intervention and outcomes was examined by means of multiple regression models controlling for baseline consumption and for the individuals' sociodemographic factors. Results. 61 individuals were studied. Average coverage of the activities and educational materials was 63.5%, and these were evaluated positively by 98% of the employees. About two-thirds of employees noticed changes in at least two aspects relating to the variety and appearance of the dishes. Of the total, 88.6% trusted the hygiene of these foods after the intervention, compared with 56.9% before it. There was an increase of 53.6 g (38%) in FV consumption at lunch in the workplace. There was also an increase in the regular consumption of greens (from 47.5% to 72.1%) and in the average number of days on which greens were consumed (from 4.4 to 5.6 days). Associations were observed between increased FV consumption and a positive change in trust in the hygiene of the raw foods served; between increased vegetable consumption and both a positive change in trust in the hygiene of the raw foods served and the level of exposure to the educational component of the intervention; and between an increase in the average number of days of vegetable consumption and the perception of positive changes in the variety and presentation of FV dishes. Conclusion. There was an increase in FV consumption among employees exposed to the intervention. The study's multicomponent design appears to have contributed to these findings.
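A minimal sketch of the kind of adjusted before/after comparison described above is given below, with simulated data; the variable names, effect sizes, and model specification are illustrative assumptions, not the study's actual analysis.

```python
# Post-intervention FV consumption regressed on exposure, controlling for
# baseline consumption and sociodemographic covariates (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 61                                    # study size from the abstract
df = pd.DataFrame({
    "fv_before": rng.normal(140, 40, n),  # grams of FV at baseline lunch
    "exposure": rng.uniform(0, 1, n),     # share of activities attended
    "age": rng.normal(40, 10, n),
    "female": rng.integers(0, 2, n),
})
# Simulated outcome: baseline + hypothetical exposure effect + noise.
df["fv_after"] = df.fv_before + 80 * df.exposure + rng.normal(0, 25, n)

model = smf.ols("fv_after ~ exposure + fv_before + age + female", data=df).fit()
print(model.params["exposure"], model.pvalues["exposure"])
```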