963 results for OPHTHALMIC SOLUTION 1-PERCENT
Abstract:
The Spanish Ministry of Economy and Competitiveness is funding the SHERIF Research Project, which falls under the INNPACTO program. The project aims to increase the rate of refurbishment of existing buildings from the energy-efficiency point of view by designing a facade system that is an economical, flexible and integrated solution. Several tasks have been performed under this project regarding the constructive characterization and energy evaluation of the thermal behaviour of facades on existing buildings. To carry out the latter task, on which this article focuses, a survey of various buildings in the Ciudad de los Angeles neighbourhood has been developed; its main objective is to compare the actual energy and lighting behaviour of different buildings before and after any refurbishment works have been undertaken. Evaluating the actual performance of buildings before and after refurbishment is intended to determine the impact of the work carried out, as well as to draw lessons from it for future interventions.
Abstract:
Recently, the use of persulfate for in situ chemical oxidation of sites contaminated by organic compounds has gained prominence. However, the solid soil matrix can interact with persulfate, favouring the formation of free radicals but also preventing the oxidant from reaching the contaminant, either through the oxidation of reduced compounds present in the soil or through changes in the soil's hydraulic properties. This research aimed to evaluate whether the interactions between a persulfate solution and three Brazilian soils could interfere with its oxidation capacity, and whether those interactions could alter the soils' hydraulic properties. To this end, oxidation tests were carried out on Latossolo Vermelho (LV), Latossolo Vermelho Amarelo (LVA) and Neossolo Quartzarênico (NQ) soils with persulfate solution (1 g/L and 14 g/L) in batch experiments, as well as oxidation of the LV soil with persulfate solution (9 g/L and 14 g/L) in undisturbed columns. The results showed that persulfate decay followed a first-order model and that consumption of the oxidant was not finite. The highest reaction rate constant (kobs) was observed for the reactor with LV. This stronger interaction stemmed from differences in mineralogical composition and specific surface area. Kaolinite, gibbsite and iron oxides showed the greatest interaction with persulfate. The drop in the pH of the reactor solutions caused leaching of aluminium and iron through mineral dissolution. The mobilized iron may have acted as a catalyst of the reaction, favouring the formation of free radicals, but it was the main driver of oxidant consumption. Part of the oxidized iron may have precipitated as crystalline oxide, favouring pore clogging.
Because of the higher ratio of persulfate mass to soil mass, the kobs obtained in the column test was 23 times higher than that obtained in the batch test, even though the column test used a concentration 1.5 times lower. Soil hydraulic conductivity decreased, and water flow became heterogeneous after oxidation owing to changes in mineral structure. In the remediation of areas dominated by tropical soils, especially LV, free radicals may form, but oxidant consumption can be pronounced and not finite. The solution pH should not fall below 5, in order to avoid the mobilization of metals into the groundwater and possible pore clogging through the disaggregation of clay grains.
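The first-order decay model used for the persulfate data can be sketched numerically. The following is an illustrative least-squares fit of a pseudo-first-order rate constant kobs from concentration-time data; the numbers are hypothetical, not taken from the thesis.

```python
import math

def fit_first_order_k(times_h, conc_g_per_l):
    """Least-squares estimate of the pseudo-first-order rate constant
    kobs (1/h) from the linearized model ln(C/C0) = -kobs * t."""
    c0 = conc_g_per_l[0]
    xs = list(times_h)
    ys = [math.log(c / c0) for c in conc_g_per_l]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -slope

# Hypothetical decay of a 14 g/L persulfate solution with kobs = 0.05 1/h
times = [0.0, 10.0, 20.0, 40.0, 80.0]
conc = [14.0 * math.exp(-0.05 * t) for t in times]
kobs = fit_first_order_k(times, conc)  # recovers 0.05 1/h
```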
Abstract:
This thesis uses models of firm-heterogeneity to complete empirical analyses in economic history and agricultural economics. In Chapter 2, a theoretical model of firm heterogeneity is used to derive a statistic that summarizes the welfare gains from the introduction of a new technology. The empirical application considers the use of mechanical steam power in the Canadian manufacturing sector during the late nineteenth century. I exploit exogenous variation in geography to estimate several parameters of the model. My results indicate that the use of steam power resulted in a 15.1 percent increase in firm-level productivity and a 3.0-5.2 percent increase in aggregate welfare. Chapter 3 considers various policy alternatives to price ceiling legislation in the market for production quotas in the dairy farming sector in Quebec. I develop a dynamic model of the demand for quotas with farmers who are heterogeneous in their marginal cost of milk production. The econometric analysis uses farm-level data and estimates a parameter of the theoretical model that is required for the counterfactual experiments. The results indicate that the price of quotas could be reduced to the ceiling price through a 4.16 percent expansion of the aggregate supply of quotas, or through moderate trade liberalization of Canadian dairy products. In Chapter 4, I study the relationship between farm-level productivity and participation in the Commercial Export Milk (CEM) program. I use a difference-in-differences research design with inverse propensity weights to test for causality between participation in the CEM program and total factor productivity (TFP). I find a positive correlation between participation in the CEM program and TFP; however, I find no statistically significant evidence that the CEM program affected TFP.
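The difference-in-differences estimator with inverse propensity weights described in Chapter 4 can be sketched as follows; the records and propensity scores below are hypothetical, not the CEM farm-level data.

```python
# Toy inverse-propensity-weighted difference-in-differences estimate.
# Each record is (treated, post, outcome, propensity_score).
def ipw_did(records):
    groups = {(t, p): [] for t in (0, 1) for p in (0, 1)}
    for treated, post, outcome, e in records:
        # weight treated units by 1/e and controls by 1/(1 - e)
        w = 1.0 / e if treated else 1.0 / (1.0 - e)
        groups[(treated, post)].append((outcome, w))

    def wmean(rows):
        return sum(y * w for y, w in rows) / sum(w for _, w in rows)

    treated_change = wmean(groups[(1, 1)]) - wmean(groups[(1, 0)])
    control_change = wmean(groups[(0, 1)]) - wmean(groups[(0, 0)])
    return treated_change - control_change

records = [
    (1, 0, 10.0, 0.5), (1, 1, 13.0, 0.5),  # treated: +3 over time
    (0, 0, 10.0, 0.5), (0, 1, 11.0, 0.5),  # control: +1 common trend
]
effect = ipw_did(records)  # 2.0, the treated change net of the common trend
```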
Abstract:
On the morning of March 27th, 2013, a small portion of a much larger landslide complex failed on the western shoreline of central Whidbey Island, Island County, Washington. This landslide, known as the Ledgewood-Bonair Landslide (LB Landslide), mobilized as much as 150,000 cubic meters of unconsolidated glacial sediment onto the coastline of the Puget Sound (Slaughter et al., 2013, Geotechnical Engineering Services, 2013). This study aims to determine how sediment from the Ledgewood-Bonair Landslide has acted on the adjacent beaches 400 meters to the north and south, and specifically to evaluate the volume of sediment contributed by the slide to adjacent beaches, how persistent bluff-derived accretion has been on adjacent beaches, and how intertidal grain sizes changed as a result of the bluff-derived sediment. LiDAR imagery from 2013 and 2014 was differenced and compared to beach profile data and grain size photography. Volume change results indicate that of the 41,850 cubic meters of sediment eroded at the toe of the landslide, 8.9 percent was redeposited on adjacent beaches within 1 year of the landslide. Of this 8.9 percent, 6.3 percent ended up on the north beach and 2.6 percent ended up on the south beach. Because the landslide deposit was primarily sands, silts, and clays, it is reasonable to assume that the remaining 91.1 percent of the sediment eroded from the landslide toe was carried out into the waters of the Puget Sound. Over the course of the two-year study, measurable accretion is apparent up to 150 meters north and 100 meters south of the landslide complex. Profile data also suggest that the most significant elevation changes occurred within the first two and a half months after the landslide's occurrence.
The dominant surficial grain size of the beach soon after the landslide was coarse-sand; in the years following the landslide, 150 meters north of the toe the beach sediment became finer while 100 meters south of the toe the beach sediment became coarser. Overall, the LB Landslide has affected beach profile and grain size only locally, within 150 meters of the landslide toe.
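The LiDAR-differencing volume estimate used in this study amounts to summing per-cell elevation differences scaled by cell area. The grid below is a toy example, not the actual LiDAR data.

```python
# Toy digital-elevation-model differencing: net volume change is the sum
# of per-cell elevation differences times the cell footprint area.
def volume_change_m3(dem_before, dem_after, cell_area_m2):
    total = 0.0
    for row_b, row_a in zip(dem_before, dem_after):
        for z_before, z_after in zip(row_b, row_a):
            total += (z_after - z_before) * cell_area_m2
    return total  # negative values indicate net erosion

before = [[5.0, 5.0],
          [5.0, 5.0]]
after = [[4.0, 5.5],
         [5.0, 4.5]]
dv = volume_change_m3(before, after, cell_area_m2=1.0)  # -1.0 m3 net loss
```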
Abstract:
Background The treatment of infants with bronchiolitis is largely supportive. The role of bronchodilators is controversial. Most studies of the use of bronchodilators have enrolled small numbers of subjects and have examined only short-term outcomes, such as clinical scores. Methods We conducted a randomized, double-blind, controlled trial comparing nebulized single-isomer epinephrine with placebo in 194 infants admitted to four hospitals in Queensland, Australia, with a clinical diagnosis of bronchiolitis. Three 4-ml doses of 1 percent nebulized epinephrine or three 4-ml doses of normal saline were administered at four-hour intervals after hospital admission. Observations were made at admission and just before, 30 minutes after, and 60 minutes after each dose. The primary outcome measures were the length of the hospital stay and the time until the infant was ready for discharge. The secondary outcome measures were the degree of change in the respiratory rate, the heart rate, and the respiratory-effort score and the time that supplemental oxygen was required. Results There were no significant overall differences between the groups in the length of the hospital stay (P=0.16) or the time until the infant was ready for discharge (P=0.86). Among infants who required supplemental oxygen and intravenous fluids, the time until the infant was ready for discharge was significantly longer in the epinephrine group than in the placebo group (P=0.02). The need for supplemental oxygen at admission had the greatest influence on the score for severity of illness and strongly predicted the length of the hospital stay and the time until the infant was ready for discharge (P
Abstract:
PURPOSE: This study was undertaken to audit a single-center experience with laparoscopically assisted resection rectopexy for full-thickness rectal prolapse. The clinical outcomes and long-term results were evaluated. METHODS: Data were prospectively collected on the duration of the operation, time to passage of flatus postoperatively, hospital stay, morbidity, and mortality. For follow-up, patients received a questionnaire or were contacted. The data were divided into quartiles over the study period, and the differences in operating time and length of hospital stay were tested using the Kruskal-Wallis test. RESULTS: Between March 1992 and October 2003, a total of 117 patients underwent laparoscopic resection rectopexy for rectal prolapse. The median operating time during the first quartile (representing the early experience) was 180 minutes compared with 110 minutes for the fourth quartile (Kruskal-Wallis test for operating time = 35.523, 3 df, P < 0.0001). Overall morbidity was 9 percent (ten patients), with one death (< 1 percent). One patient had a ureteric injury requiring conversion. One minor anastomotic leak occurred, necessitating laparoscopic evacuation of a pelvic abscess. Altogether, 77 patients were available for follow-up. The median follow-up was 62 months. Eighty percent of the patients reported alleviation of their symptoms after the operation. Sixty-nine percent of the constipated patients experienced an improvement in bowel frequency. No patient had new or worsening symptoms of constipation after surgery. Two (2.5 percent) patients had full-thickness rectal prolapse recurrence. Mucosal prolapse recurred in 14 (18 percent) patients. Anastomotic dilation was performed for stricture in five (4 percent) patients. CONCLUSIONS: Laparoscopically assisted resection rectopexy for rectal prolapse provides a favorable functional outcome and a low recurrence rate. Shorter operating times are achieved with experience.
The benefits of the minimally invasive technique should be considered when offering patients with rectal prolapse a transabdominal approach for repair, and emphasis should now be on advanced training in the laparoscopic approach.
Abstract:
Controlling the water content within a product has long been required in the chemical processing, agriculture, food storage, paper manufacturing, semiconductor, pharmaceutical and fuel industries. The limitations of water content measurement as an indicator of safety and quality are attributed to differences in the strength with which water associates with other components in the product. Water activity indicates how tightly water is "bound," structurally or chemically, in a product. Water absorption changes the volume and refractive index of poly(methyl methacrylate) (PMMA). Therefore, for a grating made in a PMMA-based optical fiber, its wavelength is an indicator of water absorption, and PMMA can thus be used as a water activity sensor. In this work we have investigated the performance of a PMMA-based optical fiber grating as a water activity sensor in sugar solution, saline solution and Jet A-1 aviation fuel. Samples of sugar solution with sugar concentrations from 0 to 8%, saline solution with concentrations from 0 to 22%, and dried (10 ppm), ambient (39 ppm) and wet (68 ppm) aviation fuels were used in the experiments. The corresponding water activities were measured as 1.0 to 0.99 for the sugar solution, 1.0 to 0.86 for the saline solution, and 0.15, 0.57 and 1.0 for the aviation fuel samples. The water content in the measured samples ranges from 100% (pure water) to 10 ppm (dried aviation fuel). The PMMA-based optical fiber grating exhibits good sensitivity and a consistent response, with Bragg wavelength shifts as large as 3.4 nm when the sensor is transferred from dry fuel to wet fuel. © 2014 Copyright SPIE.
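The sensing principle rests on the Bragg condition, lambda_B = 2 * n_eff * period: water uptake swells the fiber and changes its refractive index, shifting lambda_B. The sketch below uses hypothetical values for n_eff, the grating period, and the fractional changes, chosen only to illustrate a shift of roughly the 3.4 nm magnitude reported.

```python
def bragg_wavelength_nm(n_eff, period_nm):
    """Bragg condition for a fiber grating: lambda_B = 2 * n_eff * period."""
    return 2.0 * n_eff * period_nm

# Hypothetical dry-state parameters for a grating near 1557 nm
dry = bragg_wavelength_nm(n_eff=1.4890, period_nm=523.0)

# Assume water uptake raises both n_eff and the period by 0.11 percent
wet = bragg_wavelength_nm(n_eff=1.4890 * 1.0011, period_nm=523.0 * 1.0011)

shift = wet - dry  # about 3.43 nm for these assumed values
```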
Abstract:
Both light quantity and quality affect the development and autecology of plants under shade conditions, as in the understorey of tropical forests. However, little research has been directed towards the relative contributions of lowered photosynthetic photon flux density (PPFD) versus altered spectral distributions (as indicated by quantum ratios of 660 to 730 nm, or R:FR) of radiation underneath vegetation canopies. A method for constructing shade enclosures to study the contributions of these two variables is described. Three tropical leguminous vine species (Abrus precatorius L., Caesalpinia bondicela Fleming and Mucuna pruriens (L.) DC.) were grown in two shade enclosures with 3-4% of solar PPFD and either the R:FR of sunlight (1.10) or that of foliage shade (0.33), and compared to plants grown in sunlight. Most species treated with low R:FR differed from those treated with high R:FR in (1) percent allocation to dry leaf weight, (2) internode length, (3) dry stem weight/length, (4) specific leaf weight, (5) leaf size, and (6) chlorophyll a/b ratios. However, these plants did not differ in chlorophyll content per leaf dry weight or area. In most cases the effects of low R:FR were additive to those of low PPFD. Growth patterns varied among the three species, but both low PPFD and diminished R:FR were important cues in their developmental responses to light environments. This shadehouse system should be useful in studying the effects of light on the developmental ecology of other tropical forest plants.
Abstract:
Liquidity is an important attribute of an asset that investors would like to take into consideration when making investment decisions. However, previous empirical evidence on whether liquidity is a determinant of stock returns is not unanimous. This dissertation provides a comprehensive study of the role of liquidity in asset pricing, using the Fama-French (1993) three-factor model and the Kraus and Litzenberger (1976) three-moment CAPM as models for risk adjustment. The relationships between liquidity and well-known determinants of stock returns, such as size and book-to-market, are also investigated. The study examines liquidity and asset pricing issues for both intertemporal and cross-sectional data. The results indicate the existence of a liquidity premium, i.e., less liquid stocks demand a higher rate of return than more liquid stocks. More specifically, a drop of 1 percent in liquidity is associated with a higher rate of return of about 2 to 3 basis points per month. Further investigation reveals that neither the Fama-French three-factor model nor the three-moment CAPM captures the liquidity premium. Finally, the results show that well-known determinants of stock returns such as size and book-to-market do not serve as proxies for liquidity. Overall, this dissertation shows that a liquidity premium exists in the stock market and that liquidity is a distinct effect that is not subsumed by market factors, non-market factors, or other stock characteristics.
Abstract:
In - Protecting Your Assets: A Well-Defined Credit Policy Is The Key - an essay by Steven V. Moll, Associate Professor, The School of Hospitality Management at Florida International University, Professor Moll observes at the outset: “Bad debts as a percentage of credit sales have climbed to record levels in the industry. The author offers suggestions on protecting assets and working with the law to better manage the business.” “Because of the nature of the hospitality industry and its traditional liberal credit policies, especially in hotels, bad debts as a percentage of credit sales have climbed to record levels,” our author says. “In 1977, hotels showing a net income maintained an average accounts receivable ratio to total sales of 3.4 percent. In 1983, the accounts receivable ratio to total sales increased to 4.1 percent in hotels showing a net income and 4.4 percent in hotels showing a net loss,” he further cites. As the professor implies, there are ways to mitigate the losses from bad credit or difficult-to-collect credit sales, and in this article he offers suggestions on how to do so. Moll suggests that hotels and food & beverage operations tighten their credit-extension policies and, on the collection side, be more aggressive in pursuing debts. There is a balance to strike here: bad credit, as a negative element in and of itself, is not the only reflection the profit/loss mirror offers. “Credit managers must know what terms to offer in order to compete and afford the highest profit margin allowable,” Moll says. “They must know the risk involved with each guest account and be extremely alert to the rights and wrongs of good credit management,” he advocates. A sound profit policy can be the result of some marginal and additional credit risk on the part of the operation manager. “Reality has shown that high profits, not small credit losses, are the real indicator of good credit management,” the author reveals.
“A low bad debt history may indicate that an establishment has an overly conservative credit management policy and is sacrificing potential sales and profits by turning away marginal accounts,” Moll argues, and there is little in the evidence to contradict him. Professor Moll provides a fairly comprehensive list illustrating when a manager would want to adopt a conservative credit policy. In the final analysis, the aim is to implement a policy that weighs an acceptable amount of credit risk against a potential profit ratio. In closing, Professor Moll offers some collection strategies for delinquent credit accounts, with reference to computer and attorney participation, and brings cash and cash discounts into the discussion as well. Additionally, there is some very useful information about what debt collectors cannot do.
Abstract:
In the discussion - Selection Of Students For Hotel Schools: A Comparative Study - by William Morgan, Professor, School of Hospitality Management at Florida International University, Morgan’s initial observation is: “Standards for the selection of students into schools of hospitality management around the world vary considerably when it comes to measuring attitudes toward the industry. The author discusses current standards and recommends some changes.” In addition to intellectual ability, Professor Morgan wants you to know that an intangible element such as attitude is an equally important consideration for students seeking curricula and careers in the hospitality field. “…breaches in behavior or problems in the tourist employee encounter are often caused by attitudinal conditions which pre-exist the training and which were not able to be totally corrected by the unfreezing, movement, and refreezing processes required in attitudinal change,” says Morgan. “…other than for some requirements for level or grade completed or marks obtained, 26 of the 54 countries sampled (48.1 percent) had no pre-selection process at all. Of those having some form of a selection process (in addition to grades), 14 schools in 12 countries (22.2 percent) had a formal admissions examination,” Professor Morgan empirically provides. “It was impossible, however, to determine the scope of this admissions examination as it might relate to attitude.” The attitude intangible is a difficult one to quantify. With an apparent sameness in hotels, restaurants, and their facilities, the significant distinctions are to be found in their employees. This makes the selection process, for both schools and employers, a high priority. Moreover, can a student, or a prospective employee, overcome stereotypes and prejudices to provide a high degree of service in the hospitality industry? This query is an important element of this article.
“If utilized in the hotel, technical, or trade school or in the hiring process at the individual facility, this [hiring] process would provide an opportunity to determine if the prospective student or worker is receptive to the training to be received,” advises Professor Morgan. “Such a student or worker is realistic in his aims and aspirations, ready in his ability to receive training, and responsive to the needs of the guest, often someone very different from himself in language, dress, or degree of creature comforts desired,” your author further counsels. Professor Morgan looks to transactional analysis, role playing, languages, and cross-cultural education as playing significant roles in producing well-intentioned and knowledgeable employees. He expands upon these concepts in the article. Professor Morgan holds The International Center of Glion, Switzerland in high regard and cites that program’s efforts to maintain relationships and provide graduates with ongoing attitudinal enlightenment programs.
Abstract:
Empirical studies of education programs and systems, by nature, rely upon use of student outcomes that are measurable. Often, these come in the form of test scores. However, in light of growing evidence about the long-run importance of other student skills and behaviors, the time has come for a broader approach to evaluating education. This dissertation undertakes experimental, quasi-experimental, and descriptive analyses to examine social, behavioral, and health-related mechanisms of the educational process. My overarching research question is simply, which inside- and outside-the-classroom features of schools and educational interventions are most beneficial to students in the long term? Furthermore, how can we apply this evidence toward informing policy that could effectively reduce stark social, educational, and economic inequalities?
The first study of three assesses mechanisms by which the Fast Track project, a randomized intervention in the early 1990s for high-risk children in four communities (Durham, NC; Nashville, TN; rural PA; and Seattle, WA), reduced delinquency, arrests, and health and mental health service utilization in adolescence through young adulthood (ages 12-20). A decomposition of treatment effects indicates that about a third of Fast Track’s impact on later crime outcomes can be accounted for by improvements in social and self-regulation skills during childhood (ages 6-11), such as prosocial behavior, emotion regulation and problem solving. These skills proved less valuable for the prevention of mental and physical health problems.
The second study contributes new evidence on how non-instructional investments – such as increased spending on school social workers, guidance counselors, and health services – affect multiple aspects of student performance and well-being. Merging several administrative data sources spanning the 1996-2013 school years in North Carolina, I use an instrumental variables approach to estimate the extent to which local expenditure shifts affect students’ academic and behavioral outcomes. My findings indicate that exogenous increases in spending on non-instructional services not only reduce student absenteeism and disciplinary problems (important predictors of long-term outcomes) but also significantly raise student achievement, in similar magnitude to corresponding increases in instructional spending. Furthermore, subgroup analyses suggest that investments in student support personnel such as social workers, health services, and guidance counselors, in schools with concentrated low-income student populations could go a long way toward closing socioeconomic achievement gaps.
The third study examines individual pathways that lead to high school graduation or dropout. It employs a variety of machine learning techniques, including decision trees, random forests with bagging and boosting, and support vector machines, to predict student dropout using longitudinal administrative data from North Carolina. I consider a large set of predictor measures from grades three through eight including academic achievement, behavioral indicators, and background characteristics. My findings indicate that the most important predictors include eighth grade absences, math scores, and age-for-grade as well as early reading scores. Support vector classification (with a high cost parameter and low gamma parameter) predicts high school dropout with the highest overall validity in the testing dataset at 90.1 percent followed by decision trees with boosting and interaction terms at 89.5 percent.
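The "overall validity" figures above are classification accuracies on a held-out test set. A minimal sketch of that evaluation, using a toy decision stump on one of the named predictors (eighth-grade absences) rather than the thesis's support vector machines; the data and threshold are hypothetical.

```python
# Toy stand-in for the dropout classifiers: a one-feature decision stump
# on eighth-grade absences, scored by accuracy ("overall validity") on a
# hypothetical held-out test set of (absences, dropped_out) pairs.
def stump_predict(absences, threshold=20):
    return 1 if absences >= threshold else 0  # 1 = predicted dropout

test_set = [(25, 1), (3, 0), (30, 1), (10, 0), (22, 0),
            (1, 0), (40, 1), (5, 0), (18, 1), (2, 0)]
correct = sum(stump_predict(a) == y for a, y in test_set)
accuracy = correct / len(test_set)  # 0.8 on this toy sample
```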
Abstract:
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient’s medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast-acquisition imaging device with higher spatial resolution and a higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU). Higher HU values represent higher density. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefact. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts and improve image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on severe artefacts in CT images. This study uses the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method.
Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy Imaging Method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc and single-arc, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy Method. Calculated doses (mean, minimum, and maximum) to the planning treatment volume (PTV) were compared, and the homogeneity index (HI) was calculated.
Results: (1) Without the GSI-based MAR application, a percent error between mean dose and the absolute dose ranging from 3.4-5.7% per fraction was observed. In contrast, the error was decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7-4.2% per fraction between with and without using the GSI-based MAR algorithm. (2) A range of 0.1-3.2% difference was observed for the maximum dose values, 1.5-10.4% for minimum dose difference, and 1.4-1.7% difference on the mean doses. Homogeneity indexes (HI) ranging from 0.068-0.065 for dual-energy method and 0.063-0.141 with projection-based MAR algorithm were also calculated.
Conclusion: (1) Percent error without using the GSI-based MAR algorithm may deviate as high as 5.7%. This error invalidates the goal of Radiation Therapy to provide a more precise treatment. Thus, GSI-based MAR algorithm was desirable due to its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of different techniques but deviation was evident on the maximum and minimum doses. The HI for the dual-energy method almost achieved the desirable null values. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) for images with metal artefacts than with or without GE MAR Algorithm.
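The two metrics reported above can be sketched as follows. The dose values are hypothetical, and HI is computed here as (Dmax - Dmin)/Dmean, one common definition; the thesis may use a different formula.

```python
def percent_error(measured, reference):
    """Absolute percent error of a calculated dose against a reference dose."""
    return abs(measured - reference) / reference * 100.0

def homogeneity_index(d_max, d_min, d_mean):
    """HI as (Dmax - Dmin) / Dmean; definitions vary between planning systems."""
    return (d_max - d_min) / d_mean

# Hypothetical per-fraction doses in Gy
pe = percent_error(measured=2.11, reference=2.00)            # 5.5 percent
hi = homogeneity_index(d_max=2.10, d_min=1.97, d_mean=2.00)  # 0.065
```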
Abstract:
Though significant progress has been made through control efforts in recent years, malaria remains a leading cause of morbidity and mortality throughout the world, with 3.2 billion people at risk of developing the disease. Zanzibar is currently pursuing malaria elimination through the Zanzibar Malaria Elimination Program (ZAMEP) and is working toward a goal of no locally acquired malaria cases by 2018. A comprehensive and well-functioning malaria surveillance program is central to achieving this goal. Under ZAMEP’s current surveillance strategy, District Malaria Surveillance Officers (DMSOs) respond to malaria case notifications through the reactive case detection (RACD) system. Three malaria screening and treatment strategies are undertaken in response to this system: household-level (HSaT), focal-level (FSaT), and mass-level (MSaT). Each strategy is triggered by a different case threshold and tests a different-sized population. The aims of this study were to (1) assess the cost effectiveness of the three malaria screening and treatment strategies; (2) assess the timeliness and completeness of ZAMEP’s RACD system; and (3) qualitatively explore the roles of DMSOs.
Screening disposition and budget information for 2014 screening and treatment strategies was analyzed to determine prevalence rates in screened populations and the cost effectiveness of each strategy. Prevalence rates within the screened population varied by strategy: 6.1 percent in HSaT, 1.2 percent in FSaT, and 0.9 percent in MSaT. Of the various costing scenarios considering cost per person screened, MSaT was the most cost-effective, with costs ranging from $9.57 to $12.57 per person screened. Of the various costing scenarios considering cost per case detected, HSaT was the most cost-effective, at $385.51 per case detected.
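The cost-per-case-detected arithmetic can be sketched as follows; the budget and screening totals are hypothetical values chosen only to reproduce the HSaT figure quoted above.

```python
def cost_per_person_screened(total_cost_usd, screened):
    return total_cost_usd / screened

def cost_per_case_detected(total_cost_usd, screened, prevalence):
    # cases detected = people screened times the positivity (prevalence) rate
    return total_cost_usd / (screened * prevalence)

# Hypothetical HSaT-like scenario: 1,000 people screened at 6.1 percent
# prevalence, with a total strategy cost chosen to give about $385.51/case
hsat_per_case = cost_per_case_detected(23516.0, screened=1000, prevalence=0.061)
```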
Case data from 2013 through mid-2015 were used to assess the timeliness and completeness of the RACD system. The average percentage of RACD activities occurring within 48 hours of notification improved slightly between 2013 and the first half of 2015, from 90.7 percent to 93.1 percent. The average percentage of household members screened during RACD also increased over the same time period, from 84 percent in 2013 to 89.9 percent in the first half of 2015.
Interviews with twenty DMSOs were conducted to gain insight into the challenges to malaria elimination from both the health system and community perspectives. Major themes discussed in the interviews include the need for additional training, inadequate information capture at health facilities, resistance to household testing, transportation difficulties, inadequate personnel during the high transmission season, and community misinformation.
Zanzibar is now considered a low transmission setting, making elimination feasible, but also posing new challenges to achieving this goal. The findings of this study provide insight into how surveillance activities can be improved to support the goal of malaria elimination in Zanzibar. Key changes include reevaluating the use of MSaT activities, improving information capture at health facilities, hiring additional DMSOs during the high transmission season, and improving community communication.