833 results for Using mobile phones for development


Relevance:

40.00%

Publisher:

Abstract:

This thesis investigates patterns of economic growth and development from a number of perspectives that recur in the growth literature. By addressing themes of history, geography, institutions and culture, the thesis brings a wide range of inter-related literatures and methodologies to bear within a single study. Additionally, by targeting different administrative levels in its research design and approach, it provides a comprehensive treatment of the economic growth dilemma from both cross-national and sub-national perspectives. The three chapters herein discuss economic development along two broad dimensions. The first chapter takes on the economic growth inquiry by incorporating cultural geography within a cross-country formal spatial econometric growth framework. By introducing the global cultural dynamics of languages and ethnic groups as spatial network mechanisms, this chapter distinguishes economic growth effects accruing from own-country productive efforts from those accruing from interconnections within a global productive network chain. From this, deductions are drawn about the implications for both developed and developing countries as regards potential gains and losses from such types and levels of productive integration. The second and third chapters take a different approach to the economic development inquiry. Both focus on economic activity in Africa, tackling the relevant issues from a geo-intersected dimension involving historic regional tribal homelands and modern national and subnational administrative territories. The second chapter uses historical channels to investigate the connection between national institutional quality and economic development in demarcated tribal homelands at the fringes of national African borders. The third chapter looks more closely at the effects of demarcations on economic activity. It particularly probes how different kinds of demarcation, warranted by two different but very relevant classes of politico-economic players, have affected economic activity quite distinctly within the resulting subnational regions in Africa.
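The cross-country spatial framework described above belongs to the family of spatial autoregressive (SAR) specifications, in which a country's outcome depends on the outcomes of its network neighbours through a weights matrix. The snippet below is a minimal illustrative sketch of that general idea, not the thesis's actual model; the weights matrix `W`, the spatial dependence parameter `rho` and the covariates are hypothetical.

```python
import numpy as np

# Minimal spatial autoregressive (SAR) sketch: y = rho * W y + X beta + eps,
# reduced form y = (I - rho * W)^{-1} (X beta + eps). All quantities are hypothetical.
rng = np.random.default_rng(0)
n = 50                                     # number of countries (illustrative)

# Symmetric, row-normalised weights matrix W standing in for a cultural/linguistic network.
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1)
A = A + A.T                                # no self-links, symmetric adjacency
W = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)

rho = 0.4                                  # assumed strength of spatial dependence
beta = np.array([2.0, 0.5])                # assumed own-country coefficients
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(scale=0.1, size=n)

# Simulated outcomes: own-country inputs plus spillovers through the network.
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + eps)

# Network component: how much of y is due to interconnections rather than own inputs alone.
network_effect = y - (X @ beta + eps)
print("mean network (spillover) component:", np.round(network_effect.mean(), 3))
```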

Relevance:

40.00%

Publisher:

Abstract:

Background: The nitration of tyrosine residues in proteins is associated with nitrosative stress and results in the formation of 3-nitrotyrosine (3-NT). 3-NT levels in biological samples have been associated with numerous physiological and pathological conditions. For this reason, several attempts have been made to develop methods that accurately quantify 3-NT in biological samples. Chromatographic methods, in particular, tend to be very accurate, showing very good sensitivity and specificity. However, accurate quantification of this molecule, which is present at very low concentrations in both physiological and pathological states, is always a complex task and a target of intense research. Objectives: We aimed to develop a simple, rapid, low-cost and sensitive 3-NT quantification method for use in medical laboratories as an additional tool for the diagnosis and/or treatment monitoring of a wide range of pathologies. We also aimed to evaluate the performance of the HPLC-based method developed here in a wide range of biological matrices. Material and methods: All experiments were performed on a Hitachi LaChrom Elite® HPLC system, and separation was carried out using a Lichrocart® 250-4 Lichrospher 100 RP-18 (5 μm) column. The method was validated according to ICH guidelines. The biological matrices tested were serum, whole blood, urine, the B16 F-10 melanoma cell line, growth medium conditioned with the same cell line, and bacterial and yeast suspensions. Results: Of all the protocols tested, the best results were obtained using 0.5% CH3COOH:MeOH:H2O (15:15:70) as the mobile phase, with detection at wavelengths of 215, 276 and 356 nm, at 25 °C, and using a flow rate of 1 mL/min. With this protocol, it was possible to obtain a linear calibration curve (correlation coefficient = 1), limits of detection and quantification on the order of ng/mL, and a short analysis time (<15 minutes per sample). Additionally, the developed protocol allowed the successful detection and quantification of 3-NT in all biological matrices tested, with detection at 356 nm. Conclusion: The method described in this study, successfully developed and validated for 3-NT quantification, is simple, cheap and fast, rendering it suitable for analysis in a wide range of biological matrices.
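For context, under the ICH approach mentioned above, detection and quantification limits are commonly estimated from the calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response (for example, the residual standard deviation of the regression) and S is the slope. The sketch below illustrates that calculation on hypothetical 3-NT calibration data; the concentrations and peak areas are made up for illustration and are not the validated results of this study.

```python
import numpy as np

# Hypothetical 3-NT calibration data: concentration (ng/mL) vs. peak area (arbitrary units).
conc = np.array([10, 25, 50, 100, 250, 500], dtype=float)
area = np.array([1.9, 5.2, 10.1, 20.4, 50.8, 101.5])

# Ordinary least-squares fit of the calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept

# Residual standard deviation of the regression (sigma) and correlation coefficient.
resid = area - pred
sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]

# ICH Q2-style limits based on the calibration curve.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"r = {r:.4f}, LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
```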

Relevance:

40.00%

Publisher:

Abstract:

Background: Body composition is affected by diseases and affects responses to medical treatments, dosages of medicines, etc., while an abnormal body composition contributes to the causation of many chronic diseases. While we have reliable biochemical tests for certain nutritional parameters of body composition, such as iron or iodine status, and we have harnessed nuclear physics to estimate the body's content of trace elements, the very basic quantification of body fat content and muscle mass remains highly problematic. Both body fat and muscle mass are vitally important, as they have opposing influences on chronic disease, but they have seldom been estimated as part of population health surveillance. Instead, most national surveys have merely reported BMI and waist circumference, or sometimes the waist/hip ratio; these indices are convenient but do not have any specific biological meaning. Anthropometry offers a practical and inexpensive method for muscle and fat estimation in clinical and epidemiological settings; however, its use is hampered by many limitations, such as a shortage of reference data, misuse of terminology, unclear assumptions, and the absence of properly validated anthropometric equations. To date, anthropometric methods are not sensitive enough to detect muscle and fat loss. Aims: The aim of this thesis is to estimate adipose/fat and muscle mass in health, disease and during weight loss by: 1. evaluating and critiquing the literature to identify the best published prediction equations for adipose/fat and muscle mass estimation; 2. deriving and validating adipose tissue and muscle mass prediction equations; and 3. evaluating the derived prediction equations, along with anthropometric indices and the best equations retrieved from the literature, in health, metabolic illness and during weight loss. Methods: A systematic review following the Cochrane Review method was used for reviewing muscle mass estimation papers that used MRI as the reference method. Fat mass estimation papers were critically reviewed. Data from subjects of mixed ethnicity, age and body mass who underwent whole-body magnetic resonance imaging to quantify adipose tissue and muscle mass (dependent variables) and anthropometry (independent variables) were used in the derivation/validation analysis. Multiple regression and Bland-Altman plots were applied to evaluate the prediction equations. To determine how well the equations identify metabolic illness, the English and Scottish health surveys were studied. Multiple regression and binary logistic regression were applied to assess model fit and associations. The populations were also divided into quintiles and relative risk was analysed. Finally, the prediction equations were evaluated by applying them to a pilot study of 10 subjects who underwent whole-body MRI, anthropometric measurements and muscle strength testing before and after weight loss, to determine how well the equations identify change in adipose/fat mass and muscle mass. Results: The estimation of fat mass has serious problems. Despite advances in technology and science, prediction equations for the estimation of fat mass depend on limited historical reference data and remain dependent upon assumptions that have not yet been properly validated for different population groups. Muscle mass does not have the same conceptual problems; however, its measurement is still problematic and reference data are scarce.
The derivation and validation analysis in this thesis was satisfactory; compared with prediction equations in the literature, the derived equations performed similarly or better. Applying the prediction equations in metabolic illness and during weight loss showed how well the equations identify metabolic illness, with significant associations with diabetes, hypertension, HbA1c and blood pressure, and moderate to high correlations with MRI-measured adipose tissue and muscle mass before and after weight loss. Conclusion: Adipose tissue mass, and to an extent muscle mass, can now be estimated for many purposes as population or group means. However, these equations must not be used for assessing fatness or categorising individuals. Further exploration in different populations and health surveys would be valuable.
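As a point of reference for the agreement analysis mentioned above, a Bland-Altman comparison plots the difference between two measurements of the same quantity against their mean and reports the bias and 95% limits of agreement (bias ± 1.96 SD). The sketch below applies this to hypothetical predicted versus MRI-measured muscle mass values; the data and variable names are illustrative only and are not taken from the thesis.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical muscle mass (kg): anthropometric prediction vs. MRI reference.
mri = np.array([22.1, 25.4, 30.2, 27.8, 33.5, 24.9, 29.1, 31.7, 26.3, 28.0])
predicted = np.array([23.0, 24.8, 29.5, 28.6, 32.1, 25.7, 30.2, 30.9, 27.1, 27.4])

# Bland-Altman quantities: bias and 95% limits of agreement.
mean_vals = (predicted + mri) / 2
diff = predicted - mri
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)

plt.scatter(mean_vals, diff)
plt.axhline(bias, linestyle="-", label=f"bias = {bias:.2f} kg")
plt.axhline(bias + loa, linestyle="--", label=f"+1.96 SD = {bias + loa:.2f} kg")
plt.axhline(bias - loa, linestyle="--", label=f"-1.96 SD = {bias - loa:.2f} kg")
plt.xlabel("Mean of prediction and MRI (kg)")
plt.ylabel("Prediction - MRI (kg)")
plt.legend()
plt.show()
```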

Relevance:

40.00%

Publisher:

Abstract:

Particular strengths of the MRC Needs for Care Assessment Schedule have been used to investigate the treatment status of patients with persistent psychiatric disability in ways that other needs assessment tools cannot. One hundred and seventy-nine such patients from three settings (a private sector psychiatric hospital, two public sector day hospitals situated in the same town, and a high security hospital) were found to have a high level of need. Although there were differences between settings, overall these needs were well met in all three. The high level of persistent disability found amongst these patients could not be attributed to failure on the part of those treating them to use the best available methods, or to failure to comply or engage with treatment on the patients' part. In some two-thirds of instances, persistent disability was best explained by the fact that even the most suitable available treatments have to be regarded as only partially effective.

Relevance:

40.00%

Publisher:

Abstract:

Automation technologies are widely acclaimed for their potential to significantly reduce energy consumption and energy-related costs in buildings. However, despite the abundance of commercially available technologies, automation in domestic environments keeps meeting with commercial failure. The main reason for this is the development process used to build automation applications, which tends to focus more on technical aspects than on the needs and limitations of the users. An instance of this problem is the complex and poorly designed home automation front-ends that deter customers from investing in a home automation product. On the other hand, developing a usable and interactive interface is a complicated task for developers because of the multidisciplinary challenges that need to be identified and solved. In this context, the present research investigates the design problems associated with developing a home automation interface as well as the existing design solutions applied to these problems. The Qualitative Data Analysis approach was used for collecting data from research papers, and an open coding process was used to cluster the findings. From the analysis of the collected data, requirements for designing the interface were derived. A home energy management functionality for a web-based home automation front-end was developed as a proof of concept, and a user evaluation was used to assess the usability of the interface. The results of the evaluation showed that this holistic approach to designing interfaces improved usability, which increases the chances of commercial success.

Relevance:

40.00%

Publisher:

Abstract:

Home automation holds the potential to realize cost savings for end users while reducing the carbon footprint of domestic energy consumption. Yet adoption is still very low. The high cost of vendor-supplied home automation systems is a major prohibiting factor. Open source systems such as FHEM, Domoticz and OpenHAB are a cheaper alternative and can drive the adoption of home automation. Moreover, they have the advantage of not being limited to a single vendor or communication technology, which gives end users flexibility in the choice of devices to include in their installation. However, interaction with devices that use diverse communication technologies can be inconvenient for users, limiting the utility they derive from the system. For application developers, creating applications that interact with the several technologies in a home automation system is not a consistent process. Hence, there is a need for a common description mechanism that makes interaction smooth for end users and enables application developers to build home automation applications in a consistent and uniform way. This thesis proposes such a description mechanism within the context of an open source home automation system, FHEM, together with a system concept for its application. A mobile application was developed as a proof of concept of the proposed description mechanism, and the results of the implementation are reflected upon.
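To make the idea of a common description mechanism concrete, the sketch below shows one possible way to describe heterogeneous devices behind a uniform interface, so that an application can address, say, a ZigBee plug and a Z-Wave thermostat in the same way. This is a hypothetical illustration of the concept only; it is neither the description format proposed in the thesis nor an API of FHEM.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class DeviceDescription:
    """Technology-neutral description of a home automation device (hypothetical format)."""
    device_id: str
    technology: str                              # e.g. "zigbee", "zwave", "http"
    capabilities: Dict[str, str]                 # capability name -> value type
    send_command: Callable[[str, str], None]     # adapter translating uniform commands

# Hypothetical adapters for two different communication technologies.
def zigbee_send(capability: str, value: str) -> None:
    print(f"[zigbee] set {capability} = {value}")

def zwave_send(capability: str, value: str) -> None:
    print(f"[zwave] set {capability} = {value}")

devices = [
    DeviceDescription("plug_kitchen", "zigbee", {"power": "on/off"}, zigbee_send),
    DeviceDescription("thermostat_hall", "zwave", {"setpoint": "celsius"}, zwave_send),
]

# An application can now address devices uniformly, regardless of technology.
for dev in devices:
    for capability in dev.capabilities:
        dev.send_command(capability, "off" if capability == "power" else "20")
```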

Relevance:

40.00%

Publisher:

Abstract:

Agricultural crops can be damaged by fungi, insects, worms and other organisms that cause diseases and decrease the yield of production. The effect of these damaging agents can be reduced using pesticides. Among them, triazole compounds are effective substances against fungi such as Oidium. Nevertheless, residues of these fungicides in foods, as well as in derived products, can affect the health of consumers. Therefore, the European Union has established several regulations fixing maximum pesticide residue levels in a wide range of foods in order to assure consumer safety. Hence, it is very important to develop adequate methods to determine these pesticide compounds. In most cases, gas or liquid chromatographic (GC, LC) separations are used in the analysis of the samples. First, however, it is necessary to use proper sample treatments in order to preconcentrate and isolate the target analytes. To this end, microextraction techniques are very effective tools, because they allow both preconcentration and extraction of the analytes in one simple step, which considerably reduces the sources of error. With these objectives, two remarkable techniques have been widely used during the last years: solid phase microextraction (SPME) and liquid phase microextraction (LPME) in its different variants. Both techniques, which avoid the use of toxic solvents or reduce the amounts required, are conveniently coupled to chromatographic equipment, providing good quantitative results for a wide range of matrices and compounds. In this work, simple and reliable methods have been developed using SPME and ultrasound-assisted emulsification microextraction (USAEME) coupled to GC or LC for triazole fungicide determination. The proposed methods allow triazole concentrations on the order of μg L−1 to be determined confidently in different fruit samples. Chemometric tools have been used to accomplish successful determinations: first, in the selection and optimization of the variables involved in the microextraction processes, and second, to overcome the problems related to overlapping peaks. Different fractional factorial designs have been used for the screening of the experimental variables, and central composite designs have been carried out to find the best experimental conditions. To solve the overlapping peak problems, multivariate calibration methods have been used. Parallel Factor Analysis 2 (PARAFAC2), Multivariate Curve Resolution (MCR) and Parallel Factor Analysis with Linear Dependencies (PARALIND) have been proposed, the algorithms adequate to the data characteristics have been used, and the results have been compared. Grape and apple samples were selected because of their occurrence in the Basque Country and their relevance to the production of cider and the regional txakoli wine. These crops are often treated with triazole compounds to control fungal problems. The peel and pulp of grape and apple, their juices and some commercial products such as musts, juice and cider have been analysed, showing the adequacy of the developed methods for triazole determination in this kind of fruit sample.
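As background to the experimental-design step described above, a central composite design augments a two-level factorial with axial and centre points so that a second-order (quadratic) response surface can be fitted and its optimum located. The sketch below builds such a design for two hypothetical microextraction variables (extraction time and temperature, in coded units) and fits a quadratic model by least squares; the factor names and simulated responses are illustrative and are not the conditions optimised in this work.

```python
import numpy as np
from itertools import product

# Central composite design for 2 factors (coded units): factorial, axial and centre points.
alpha = np.sqrt(2.0)                        # rotatable axial distance for 2 factors
factorial = np.array(list(product([-1, 1], repeat=2)), dtype=float)
axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
center = np.zeros((3, 2))
X = np.vstack([factorial, axial, center])   # coded extraction time / temperature

# Hypothetical extraction responses (e.g. peak area) with an optimum near the origin.
rng = np.random.default_rng(1)
y = 100 - 5 * X[:, 0]**2 - 3 * X[:, 1]**2 + 2 * X[:, 0] + rng.normal(0, 0.5, len(X))

# Fit a full quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2.
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)

# Stationary point of the fitted surface (candidate optimum, still in coded units).
b = coef[1:3]
B = np.array([[coef[4], coef[3] / 2], [coef[3] / 2, coef[5]]])
optimum = np.linalg.solve(-2 * B, b)
print("fitted coefficients:", np.round(coef, 2))
print("stationary point (coded units):", np.round(optimum, 2))
```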

Relevance:

40.00%

Publisher:

Abstract:

The markets for mobile-to-mobile voice services provided by operators of the Advanced Mobile Service (Sistema Móvil Avanzado) in Latin America have been subject to regulatory processes motivated by the market dominance of one operator, seeking to obtain optimal conditions of competition. Specifically in Ecuador, the Superintendencia de Telecomunicaciones (the technical telecommunications control body) developed a model to identify regulatory actions that could provide the market with sustainable competition effects in the long term. This article deals with the application of control engineering to develop an integrated model of the market, using neural networks to predict each operator's tariffs and a fuzzy logic model to predict demand. Additionally, a fuzzy inference model is presented to reproduce the operators' marketing strategies and their influence on tariffs. These models would support adequate decision-making and were validated with real data.
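To illustrate the kind of fuzzy inference mentioned above, the sketch below implements a tiny fuzzy rule base with triangular input memberships and singleton outputs (a zero-order Sugeno-style system) that maps a tariff level to a predicted demand level. The membership functions, rules and numbers are entirely hypothetical and are not taken from the article's model.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_demand(tariff: float) -> float:
    """Toy fuzzy inference: tariff (USD/min) -> demand (million minutes)."""
    # Fuzzify the input tariff into "low", "medium" and "high" (hypothetical ranges).
    low = tri(tariff, -0.05, 0.05, 0.15)
    medium = tri(tariff, 0.05, 0.15, 0.25)
    high = tri(tariff, 0.15, 0.25, 0.35)

    # Rules: low tariff -> high demand, medium -> medium, high -> low demand.
    # Each output term is represented by a single representative (singleton) value.
    demand_levels = {"high": 120.0, "medium": 80.0, "low": 40.0}
    weights = {"high": low, "medium": medium, "low": high}

    # Weighted-average defuzzification.
    total = sum(weights.values())
    if total == 0:
        return demand_levels["medium"]
    return sum(weights[k] * demand_levels[k] for k in weights) / total

for t in (0.06, 0.15, 0.24):
    print(f"tariff {t:.2f} USD/min -> predicted demand {predict_demand(t):.1f} M min")
```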

Relevance:

40.00%

Publisher:

Abstract:

Purpose: To develop and optimise variables that influence the formulation of fluoxetine orally disintegrating tablets (ODTs). Methods: Fluoxetine ODTs were prepared using the direct compression method. A three-factor, three-level Box-Behnken design was used to develop and optimize the fluoxetine ODT formulation. The design suggested 15 formulations of different lubricant concentration (X1), lubricant mixing time (X2) and compression force (X3), whose effects were monitored on tablet weight (Y1), thickness (Y2), hardness (Y3), % friability (Y4) and disintegration time (Y5). Results: All powder blends showed acceptable flow properties, ranging from good to excellent. The disintegration time (Y5) was affected directly by lubricant concentration (X1). Lubricant mixing time (X2) had a direct effect on tablet thickness (Y2) and hardness (Y3), while compression force (X3) had a direct impact on tablet hardness (Y3), % friability (Y4) and disintegration time (Y5). Accordingly, the Box-Behnken design suggested an optimized formula of 0.86 mg (X1), 15.3 min (X2) and 10.6 kN (X3). The prediction error percentages for responses Y1, Y2, Y3, Y4 and Y5 were 0.31, 0.52, 2.13, 3.92 and 3.75%, respectively. Formulas 4 and 8 achieved 90% drug release within the first 5 min of the dissolution test. Conclusion: A fluoxetine ODT formulation has been developed and optimized successfully using a Box-Behnken design and manufactured efficiently using the direct compression technique.
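For reference, a three-factor, three-level Box-Behnken design consists of the twelve runs in which two factors sit at their ±1 levels while the third is at its centre level, plus replicate centre points; with three centre replicates this yields the 15 formulations mentioned above. The sketch below constructs that design matrix in coded units; mapping the coded levels onto actual lubricant concentration, mixing time and compression force ranges is left out because those ranges are not given here.

```python
import numpy as np
from itertools import combinations, product

def box_behnken(n_factors: int, n_center: int = 3) -> np.ndarray:
    """Box-Behnken design in coded units (-1, 0, +1)."""
    runs = []
    # For each pair of factors, take the 2^2 factorial at +/-1 with the rest at 0.
    for i, j in combinations(range(n_factors), 2):
        for a, b in product([-1, 1], repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * n_factors] * n_center)   # replicate centre points
    return np.array(runs, dtype=float)

# Three factors: X1 lubricant concentration, X2 mixing time, X3 compression force.
design = box_behnken(3, n_center=3)
print(design.shape)   # (15, 3): 12 edge-midpoint runs + 3 centre points
print(design)
```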

Relevance:

40.00%

Publisher:

Abstract:

Back-pressure on a diesel engine equipped with an aftertreatment system is a function of the pressure drop across the individual components of the aftertreatment system, typically a diesel oxidation catalyst (DOC), a catalyzed particulate filter (CPF) and a selective catalytic reduction (SCR) catalyst. Pressure drop across the CPF is a function of the mass flow rate and temperature of the exhaust flowing through it, as well as the mass of particulate matter (PM) retained in the substrate wall and in the cake layer that forms on the substrate wall. Therefore, in order to keep the back-pressure on the engine low and to minimize fuel consumption, it is important to control the PM mass retained in the CPF. Chemical reactions involving the oxidation of PM under passive oxidation and active regeneration conditions can be utilized, together with numerical models running in the engine control unit (ECU), to control the pressure drop across the CPF. Hence, understanding and predicting the filtration and oxidation of PM in the CPF, and the effect of these processes on the pressure drop across the CPF, are necessary for developing control strategies for the aftertreatment system that reduce back-pressure on the engine and, in turn, fuel consumption, particularly from active regeneration. Numerical modeling of CPFs has been shown to reduce the development time and cost of aftertreatment systems used in production, as well as to facilitate understanding of the internal processes occurring during the different operating conditions that the particulate filter is subjected to. In this research work, a numerical model of the CPF was developed and calibrated to data from passive oxidation and active regeneration experiments in order to determine the kinetic parameters for the oxidation of PM and nitrogen oxides, along with the model's filtration parameters. The research results include comparisons between the model and the experimental data for pressure drop, PM mass retained, filtration efficiency, CPF outlet gas temperature and species (NO2) concentration out of the CPF. Comparisons of PM oxidation reaction rates obtained from the model calibration to the data from experiments with ULSD and 10% and 20% biodiesel-blended fuels are presented.
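To make the relationship between PM load and pressure drop concrete, the sketch below combines two standard ingredients that such filter models typically build on: Darcy's law for flow through the porous wall and soot cake (a pressure drop of μ·u·w/k per layer) and an Arrhenius rate for PM oxidation. It is a zero-dimensional toy with made-up permeabilities, geometry and kinetic constants, and is not the calibrated CPF model developed in this work.

```python
import numpy as np

# Made-up filter/soot properties for a zero-dimensional illustration.
MU = 3.0e-5          # exhaust dynamic viscosity at ~500 K, Pa.s (approximate)
WALL_T = 4.0e-4      # substrate wall thickness, m
K_WALL = 1.0e-13     # clean wall permeability, m^2 (hypothetical)
K_CAKE = 5.0e-15     # soot cake permeability, m^2 (hypothetical)
RHO_CAKE = 100.0     # packing density of the soot cake, kg/m^3 (hypothetical)
AREA = 10.0          # total filtration area, m^2 (hypothetical)
A_PRE, EA = 1.0e7, 1.5e5   # Arrhenius pre-exponential (1/s) and activation energy (J/mol)
R_GAS = 8.314

def pressure_drop(m_pm: float, vol_flow: float) -> float:
    """Darcy pressure drop (Pa) across soot cake + wall for PM mass m_pm (kg)."""
    u = vol_flow / AREA                        # superficial wall velocity, m/s
    cake_thickness = m_pm / (RHO_CAKE * AREA)
    return MU * u * (cake_thickness / K_CAKE + WALL_T / K_WALL)

def oxidize(m_pm: float, temp_k: float, dt: float) -> float:
    """Advance PM mass by dt seconds using a first-order Arrhenius oxidation rate."""
    k = A_PRE * np.exp(-EA / (R_GAS * temp_k))
    return m_pm * np.exp(-k * dt)

# Example: 30 g of PM, 0.1 m^3/s exhaust flow, 10 minutes of regeneration at 600 C.
m = 0.030
print(f"dP before regeneration: {pressure_drop(m, 0.1):.0f} Pa")
m = oxidize(m, 873.15, 600.0)
print(f"PM left: {m*1000:.1f} g, dP after: {pressure_drop(m, 0.1):.0f} Pa")
```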

Relevance:

40.00%

Publisher:

Abstract:

In the medical field, images obtained from high-definition cameras and other medical imaging systems are an integral part of medical diagnosis. The analysis of these images is usually performed by physicians, who sometimes need to spend long hours reviewing the images before they are able to arrive at a diagnosis and decide on a course of action. In this dissertation we present a framework for computer-aided analysis of medical imagery via the use of an expert system. While this problem has been discussed before, we consider a system based on mobile devices. Since the release of the iPhone in 2007, the popularity of mobile devices has increased rapidly and our lives have become more reliant on them. This popularity, and the ease of development of mobile applications, has made it possible to perform on these devices many of the image analyses that previously required a personal computer. All of this has opened the door to a whole new set of possibilities and freed physicians from their reliance on desktop machines. The approach proposed in this dissertation aims to capitalize on these newfound opportunities by providing a framework for the analysis of medical images that physicians can use from their mobile devices, thus removing their reliance on desktop computers. We also provide an expert system to aid in the analysis and to advise on the selection of medical procedures. Finally, we allow for other mobile applications to be developed by providing a generic mobile application development framework that opens the mobile domain to other applications. In this dissertation we outline the work leading towards the development of the proposed methodology and the remaining work needed to find a solution to the problem. In order to make this difficult problem tractable, we divide it into three parts: the development of a user interface modeling language and tooling, the creation of a game development modeling language and tooling, and the development of a generic mobile application framework. To make the problem more manageable, we narrow the initial scope to the hair transplant and glaucoma domains.

Relevance:

40.00%

Publisher:

Abstract:

In the thesis work presented, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures, and software. The numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated. The convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows for exact satisfaction of boundary conditions, making it possible to exactly capture the effects of the fluid field on the solid structure.
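To give a flavour of the fluid side of this coupling, the sketch below implements the core of a standard D2Q9 lattice Boltzmann solver with BGK collision, streaming, bounce-back walls and a simple first-order body force driving a channel flow. It is a generic textbook LBM kernel, not the solver developed in the thesis, and the lattice size, relaxation time and forcing are arbitrary illustrative values.

```python
import numpy as np

# D2Q9 lattice: discrete velocities, weights, and opposite-direction indices.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

nx, ny = 64, 33            # periodic in x, walls at y = 0 and y = ny - 1
tau = 0.8                  # BGK relaxation time (kinematic viscosity = (tau - 0.5)/3)
g = 1e-6                   # small body force in +x driving the channel flow

solid = np.zeros((nx, ny), dtype=bool)
solid[:, 0] = solid[:, -1] = True

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

rho = np.ones((nx, ny))
f = equilibrium(rho, np.zeros((nx, ny)), np.zeros((nx, ny)))  # start from rest

for step in range(3000):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho

    # BGK collision plus a simple first-order forcing term 3 * w_i * rho * (c_i . a).
    f += -(f - equilibrium(rho, ux, uy)) / tau \
         + 3 * w[:, None, None] * rho * c[:, 0, None, None] * g

    # Streaming: shift each population along its lattice velocity (periodic edges).
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

    # Full-way bounce-back: reverse the populations sitting on solid wall nodes.
    boundary = f[:, solid]
    f[:, solid] = boundary[opp]

ux = (f * c[:, 0, None, None]).sum(axis=0) / f.sum(axis=0)
print("centre-channel velocity profile (should be roughly parabolic):")
print(np.round(ux[nx // 2, 1:-1], 6))
```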

Relevance:

40.00%

Publisher:

Abstract:

Recent advancements in the area of nanotechnology have brought us into a new age of pervasive computing devices. These computing devices grow ever smaller and are being used in ways that were unimaginable before. Recent interest in developing precise indoor positioning systems, as opposed to existing outdoor systems, has led to much research in the area. The use of these small computing devices offers many conveniences for indoor positioning systems. This thesis deals with using small computing devices, Raspberry Pis, to enable and improve position estimation of mobile devices within closed spaces. The newly patented Orthogonal Perfect DFT Golay coding sequences are used in this scenario, and their positioning properties are tested. Testing and comparisons with other coding sequences are then carried out.
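For background on why Golay-type codes are attractive for positioning, complementary sequence pairs have aperiodic autocorrelations that sum to a perfect delta, which makes correlation-based time-of-arrival estimation robust. The sketch below generates a standard binary Golay complementary pair via the classic doubling construction and verifies the delta property; it illustrates the general idea only and is not the patented Orthogonal Perfect DFT Golay construction evaluated in the thesis.

```python
import numpy as np

def golay_pair(order: int):
    """Binary Golay complementary pair of length 2**order via the doubling construction."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(order):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acorr(x):
    """Full aperiodic autocorrelation of a real sequence."""
    return np.correlate(x, x, mode="full")

a, b = golay_pair(5)                 # length-32 complementary pair
s = acorr(a) + acorr(b)              # should be 2N at zero lag and 0 elsewhere

n = len(a)
assert abs(s[n - 1] - 2 * n) < 1e-9                   # peak of 2N at zero lag
assert np.all(np.abs(np.delete(s, n - 1)) < 1e-9)     # zero sidelobes everywhere else
print("complementary pair verified: peak", s[n - 1],
      "max sidelobe", np.abs(np.delete(s, n - 1)).max())
```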