Abstract:
Microcredit has long been used as a tool to alleviate poverty. This research examines the effectiveness of microcredit in the field of social exclusion. Questionnaires were developed and existing instruments employed to observe the tangible and intangible dimensions of microcredit, with the effort concentrated on determining whether microcredit has a direct effect on social exclusion. Bangladesh was chosen for the field study, and 85 samples were taken for the analysis. The research covered a fixed time period: one year was set for collecting the sample and carrying out the statistical analysis. The tangible aspect was assessed with a World Bank questionnaire, while the social capital questionnaire was developed from several well-established instruments. The research sample, borrowers of Grameen Bank in Bangladesh, shows a strong correlation between their tangible activity and their social life. Significant changes in the tangible aspect and in social participation were observed, and a strong correlation between the two aspects was found, taking into account that the borrowers themselves have a vibrant social life in the village.
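As an illustration of the correlation analysis described, a minimal sketch using scipy, assuming the two questionnaire scores are numeric arrays with one value per borrower (the data below are synthetic placeholders, not the study's):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# Synthetic stand-ins for the two questionnaire-derived scores (n = 85 in the study)
tangible = rng.normal(size=85)                              # tangible-activity index
social = 0.7 * tangible + rng.normal(scale=0.5, size=85)    # social-participation index

r, p = pearsonr(tangible, social)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```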
Abstract:
The space environment has always been one of the most challenging for communications, at both the physical and network layers. Concerning the latter, the most common challenges are the lack of continuous network connectivity, very long delays, and relatively frequent losses. Because of these problems, the normal TCP/IP suite protocols are hardly applicable. Moreover, in space scenarios reliability is fundamental: it is usually not tolerable to lose important information, or to receive it with a very large delay, because of a challenging transmission channel. In terrestrial protocols such as TCP, reliability is obtained by means of an ARQ (Automatic Repeat reQuest) method, which, however, performs poorly when there are long delays on the transmission channel. At the physical layer, Forward Error Correction (FEC) codes, based on the insertion of redundant information, are an alternative way to assure reliability. On binary channels, when single bits are flipped by channel noise, redundancy bits can be exploited to recover the original information. In the presence of binary erasure channels, where bits are not flipped but lost, redundancy can still be used to recover the original information. FEC codes designed for this purpose are usually called Erasure Codes (ECs). It is worth noting that ECs, primarily studied for binary channels, can also be used at upper layers, i.e. applied to packets instead of bits, offering a very interesting alternative to the usual ARQ methods, especially in the presence of long delays. A protocol created to add reliability to DTN networks is the Licklider Transmission Protocol (LTP), designed to obtain better performance on long-delay links. The aim of this thesis is the application of ECs to LTP.
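To illustrate packet-level erasure coding in general (not the specific codes studied in this thesis), a single XOR parity packet suffices to recover any one lost packet in a group; a minimal sketch:

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets: list[bytes]) -> bytes:
    """Build one parity packet as the XOR of all data packets."""
    return reduce(xor_bytes, packets)

def recover_missing(survivors: list[bytes], parity: bytes) -> bytes:
    """Recover the single erased packet: XOR of the parity and all surviving packets."""
    return reduce(xor_bytes, survivors, parity)

# Example: 3 data packets; the channel erases packet 1
data = [b"abcd", b"efgh", b"ijkl"]
parity = make_parity(data)
received = [data[0], data[2]]              # packet 1 never arrives
assert recover_missing(received, parity) == data[1]
```

Real erasure codes (e.g. Reed-Solomon or LDPC-based) tolerate multiple losses per group, but the recovery principle is the same: redundancy replaces retransmission, which matters when a round trip takes minutes.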
Abstract:
A global metabolic profiling methodology based on gas chromatography coupled to time-of-flight mass spectrometry (GC-TOFMS) for human plasma was applied to a human exercise study focused on the effects of beverages containing glucose, galactose, or fructose taken after exercise and throughout a recovery period of 6 h and 45 min. One group of 10 well-trained male cyclists performed 3 experimental sessions on separate days (randomized, single center). After performing a standardized depletion protocol on a bicycle, subjects consumed one of three different beverages: maltodextrin (MD)+glucose (2:1 ratio), MD+galactose (2:1), or MD+fructose (2:1), at an average rate of 1.25 g of carbohydrate (CHO) ingested per minute. Blood was taken immediately after exercise and every 45 min during the recovery phase. With the resulting blood plasma, insulin, free fatty acid (FFA) profile, glucose, and GC-TOFMS global metabolic profiling measurements were performed. The profiling data matched the results obtained from the other clinical measurements, with the addition of being able to follow many different metabolites throughout the recovery period. The data quality was assessed: all the labelled internal standards yielded values of <15% CV across all samples (n=335), apart from the labelled sucrose, which gave a value of 15.19%. Differences between recovery treatments, including the appearance of galactonic acid from the galactose-based beverage, were also highlighted.
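The data-quality figure reported for the internal standards is the coefficient of variation, CV = 100 × standard deviation / mean. A minimal sketch of that computation (the peak areas below are placeholders, not study data):

```python
import numpy as np

def percent_cv(values: np.ndarray) -> float:
    """Coefficient of variation as a percentage: 100 * sample std / mean."""
    return 100.0 * values.std(ddof=1) / values.mean()

# Placeholder peak areas for one labelled internal standard across samples
areas = np.array([1.02e6, 0.98e6, 1.05e6, 0.97e6, 1.01e6])
print(f"CV = {percent_cv(areas):.2f}%")  # acceptance threshold in the study: <15%
```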
Abstract:
When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression for HIV patients but also a marker for treatment initiation, and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper we study a simple and intuitive approach to estimating similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from the individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditional on covariate values at the start of each mimicked trial. This allows the study of questions that are not so easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV Cohort Study.
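A minimal sketch of the final analysis step as described, a stratified weighted Cox regression on the stacked data set of all constructed trials, using the lifelines library; the column names, weights, and toy data are assumptions for illustration:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Stacked data set: one row per subject per mimicked trial.
# 'trial' indexes the mimicked trial (the stratum); 'w' is an analysis weight.
df = pd.DataFrame({
    "trial":   [1, 1, 1, 1, 2, 2, 2, 2],
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],
    "time":    [5.0, 3.2, 6.1, 2.5, 4.4, 2.9, 5.5, 1.8],
    "event":   [1, 0, 1, 1, 0, 1, 1, 1],
    "w":       [1.0, 0.8, 1.2, 1.0, 1.0, 0.9, 1.1, 1.0],
})

cph = CoxPHFitter()
# Stratify by trial so each mimicked trial keeps its own baseline hazard;
# robust=True requests a sandwich variance, advisable when weights are used.
cph.fit(df, duration_col="time", event_col="event",
        strata=["trial"], weights_col="w", robust=True)
cph.print_summary()
```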
Abstract:
The Gaussian-3 method developed by Pople and coworkers has been used to calculate the free energy of neutral octamer clusters of water, (H2O)8. The most energetically stable structures are in excellent agreement with those determined from experiment and those predicted by previous high-level calculations. Cubic structures are favored over noncubic structures across the entire temperature range studied. The D2d cubic structure is the lowest free energy structure and dominates the potential energy and free energy hypersurfaces from 0 K to 298 K.
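The sense in which the lowest free energy structure "dominates" can be illustrated with Boltzmann weights, p_i proportional to exp(-G_i/RT); the relative free energies below are hypothetical, not values from the study:

```python
import numpy as np

R = 8.314e-3  # gas constant, kJ/(mol*K)
T = 298.0     # temperature, K

# Hypothetical relative free energies (kJ/mol) of three (H2O)8 isomers
labels = ["cubic isomer A", "cubic isomer B", "noncubic isomer"]
G = np.array([0.0, 1.5, 8.0])

weights = np.exp(-G / (R * T))
populations = weights / weights.sum()
for name, frac in zip(labels, populations):
    print(f"{name}: {frac:.1%}")   # lowest-G isomer carries most of the population
```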
Abstract:
Background: The World Health Organization estimates that in sub-Saharan Africa about 4 million HIV-infected patients had started antiretroviral therapy (ART) by the end of 2008. Loss of patients to follow-up and care is an important problem for treatment programmes in this region. As mortality is high in these patients compared to patients remaining in care, ART programmes with high rates of loss to follow-up may substantially underestimate the mortality of all patients starting ART. Methods and Findings: We developed a nomogram to correct mortality estimates for loss to follow-up, based on the fact that the mortality of all patients starting ART in a treatment programme is a weighted average of mortality among patients lost to follow-up and patients remaining in care. The nomogram gives a correction factor based on the percentage of patients lost to follow-up at a given point in time and the estimated ratio of mortality between patients lost and not lost to follow-up. The mortality observed among patients retained in care is then multiplied by the correction factor to obtain an estimate of programme-level mortality that takes all deaths into account. A web calculator directly computes the corrected, programme-level mortality with 95% confidence intervals (CIs). We applied the method to 11 ART programmes in sub-Saharan Africa. Patients retained in care had a mortality at 1 year of 1.4% to 12.0%; loss to follow-up ranged from 2.8% to 28.7%; and the correction factor from 1.2 to 8.0. The absolute difference between uncorrected and corrected mortality at 1 year ranged from 1.6% to 9.8%, and was above 5% in four programmes. The largest difference in mortality was in a programme with 28.7% of patients lost to follow-up at 1 year. Conclusions: The amount of bias in mortality estimates can be large in ART programmes with substantial loss to follow-up. Programmes should routinely report mortality among patients retained in care and the proportion of patients lost. A simple nomogram can then be used to estimate mortality among all patients who started ART, for a range of plausible mortality rates among patients lost to follow-up.
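The weighted-average relation stated above implies a simple closed form: with p the proportion lost to follow-up and r the assumed lost-versus-retained mortality ratio, the correction factor is (1 - p) + p*r. A minimal sketch (the input values are illustrative):

```python
def corrected_mortality(m_retained: float, p_lost: float, ratio: float) -> float:
    """Programme-level mortality as a weighted average over retained and lost patients.

    m_retained: observed mortality among patients retained in care (e.g., at 1 year)
    p_lost:     proportion of patients lost to follow-up at that time point
    ratio:      assumed mortality ratio, lost vs. retained patients
    """
    correction_factor = (1.0 - p_lost) + p_lost * ratio
    return m_retained * correction_factor

# Illustrative: 4% mortality among retained, 20% lost, lost patients at 5x mortality
print(f"{corrected_mortality(0.04, 0.20, 5.0):.1%}")  # 0.04 * (0.8 + 1.0) = 7.2%
```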
Abstract:
The project aimed to use results on the contamination of city vegetation with heavy metals and sulphur compounds as the basis for analysing the integral response of trees and shrubs to contamination, through a complex method of phytoindication. The results were used to draw up recommendations on pollution reduction in the city and to develop the method of phytoindication as a means of monitoring environmental pollution in St. Petersburg and other large cities. Field investigations were carried out in August 1996, and 66 descriptions of green areas were made in order to estimate the functional state of plants in the Vasileostrovsky district. Investigations of the spectral reflecting properties of plants showed considerable variation in the albedo values of leaves under the influence of various internal and external factors. The results indicated that lime trees most closely reflect the condition of the environment. Practically all the green areas studied were in poor condition, the only exceptions being areas of ash trees, which are more resistant to environmental pollution, and one lime-tree alley in a comparatively unpolluted street. The study identified those types of trees which are more or less resistant to complex environmental pollution, and Ms. Terekhina recommends that the species in the present green areas be changed to include a higher number of the more resistant species. The turbidimetric analysis of tree barks for sulphates gave an indication of the level and spatial distribution of each pollutant, and the results also confirmed other findings that electric conductivity is a significant feature in determining the extent of sulphate pollution. In testing for various metals, the lime tree showed the highest contents of all elements except magnesium, copper, zinc, cadmium and strontium, again confirming the species' vulnerability to pollution. Mean concentrations in the city and environs showed that city plants concentrate 3 times as many different elements, and 10 times more chromium, copper and lead, than do those in the suburbs.

The second stage of the study was based on the concept of phytoindication, which presupposes that changes in the relation of chemical elements in regional biological circulation under the influence of technogenesis provide a criterion for predicting displacements in people's health. There are certain basic factors in this concept. The first is that all living beings are related ecologically as well as by their evolutionary origin, and that the lower an organism is on the evolutionary scale, the less adaptational reserve it has. The second is that smaller concentrations of chemical elements are needed for a toxicological influence on plants than on people, so the former's reactions to geochemical factors are easier to characterise. Visual indicational features of urban plants are well defined and can form the basis of a complex "environment - public health" analysis. Specific plant reactions reflecting atmospheric pollution and other components of urbogeosystems make it possible to determine indication criteria for predicting possible disturbances in the general state of health of the population. The third is that the results of phytoindication investigations must be taken together with information about public health in the area. It only proved possible to analyse general indexes of public health based on statistical data from the late 1980s and early 1990s, as the data of later years were greatly influenced by social factors.

These data show that the rates of illness in St. Petersburg (especially for children) are higher than in Russia as a whole for most classes of diseases, indicating that the population there is more sensitive to the ecological state of the urban environment. The Vasileostrovsky district had the second highest sick rate for adults, while the rate of infant mortality in the first year of life was highest there. Ms. Terekhina recommends further studies to assess more precisely the effectiveness of the methods she tested, but has drawn up a proposed map of environmental hazard for the population, taking into account prevailing wind directions.
Abstract:
Atmospheric turbulence near the ground severely limits the quality of imagery acquired over long horizontal paths. In defense, surveillance, and border security applications, there is interest in deploying man-portable, embedded systems incorporating image reconstruction methods to compensate for turbulence effects. While many image reconstruction methods have been proposed, their suitability for use in man-portable embedded systems is uncertain. To be effective, these systems must operate over significant variations in turbulence conditions while subject to other variations due to operation by novice users. Systems that meet these requirements, and are otherwise designed to be immune to the factors that cause variation in performance, are considered robust. In addition to robustness in design, the portable nature of these systems implies a preference for systems with a minimum level of computational complexity. Speckle imaging methods have recently been proposed as well suited for use in man-portable horizontal imagers. In this work, the robustness of speckle imaging methods is established by identifying a subset of design parameters that provide immunity to the expected variations in operating conditions while minimizing the computation time necessary for image recovery. Design parameters are selected by parametric evaluation of system performance as factors external to the system are varied. The precise control necessary for such an evaluation is made possible by image sets of turbulence-degraded imagery generated with a novel technique for simulating anisoplanatic image formation over long horizontal paths. System performance is statistically evaluated over multiple reconstructions, using the Mean Squared Error (MSE) to evaluate reconstruction quality. In addition to the more general design parameters, the relative performance of the bispectrum and Knox-Thompson phase recovery methods is also compared. As an outcome of this work, it can be concluded that speckle imaging techniques are robust to the variation in turbulence conditions and user-controlled parameters expected when operating during the day over long horizontal paths. Speckle imaging systems that incorporate 15 or more image frames and 4 estimates of the object phase per reconstruction provide up to a 45% reduction in MSE and a 68% reduction in its deviation. In addition, the Knox-Thompson phase recovery method is shown to produce images in half the time required by the bispectrum. The quality of images reconstructed using the Knox-Thompson and bispectrum methods is also found to be nearly identical. Finally, it is shown that certain blind image quality metrics can be used in place of the MSE to evaluate quality in field scenarios. Using blind metrics rather than depending on user estimates allows for reconstruction quality that differs from the minimum MSE by as little as 1%, significantly reducing the deviation in performance due to user action.
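The quality measure used throughout, Mean Squared Error against the pristine object, is straightforward to compute; a minimal sketch with synthetic stand-ins for real imagery:

```python
import numpy as np

def mse(reference: np.ndarray, reconstruction: np.ndarray) -> float:
    """Mean Squared Error between a pristine reference image and a reconstruction."""
    reference = reference.astype(np.float64)
    reconstruction = reconstruction.astype(np.float64)
    return float(np.mean((reference - reconstruction) ** 2))

rng = np.random.default_rng(1)
truth = rng.random((256, 256))                              # stand-in for the true object
recon = truth + rng.normal(scale=0.05, size=truth.shape)    # stand-in for a reconstruction
print(f"MSE = {mse(truth, recon):.5f}")
```

The blind metrics mentioned at the end serve the same role without requiring the reference image, which is what makes them usable in the field.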
Abstract:
Many methodologies dealing with the prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass–spring networks or the finite element method (FEM). On the other hand, methodologies working directly in the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
Abstract:
Health needs assessment is an essential step before planning a new program or evaluating an existing program. The methodology applied follows principles that might differ from one country to another. The purpose of this study was to determine whether the methodology applied to assess health needs in developing nations, particularly the Albaqa Refugee Camp in Jordan, differed from the methodology used to assess health needs in developed nations. In this study, a method for health needs assessment was developed using the published literature of developed countries and was applied to a developing country, Jordan. However, the method did not apply exactly as expected, for several reasons. Some of the problems were the incompleteness and unavailability of the health data, and its poor quality in terms of validity and reliability. Thus, some adaptations were needed, and a new health needs assessment methodology specific to a particular developing country is proposed. This method depends on utilizing primary, secondary, and tertiary data, as well as conducting surveys to collect all the data that could not be found in those data sources. In general, it was concluded from this study that there is a difference between the health needs assessment methodology of a developed country and that of a developing country, specifically Jordan.
Abstract:
The objective of this paper is to evaluate the behaviour of a controller designed using a parametric Eigenstructure Assignment method and to evaluate its suitability for use in flexible spacecraft. The challenge of this objective lies in obtaining a suitable controller specifically designed to alleviate the deflections and vibrations suffered by external appendages of flexible spacecraft while performing attitude manoeuvres. One of the main problems in these vehicles is the mechanical cross-coupling that exists between the rigid and flexible parts of the spacecraft. Spacecraft with fine attitude pointing requirements need precise control of the mechanical coupling to avoid undesired attitude misalignment. In designing an attitude controller, it is necessary to consider the possible vibration of the solar panels and how it may influence the performance of the rest of the vehicle. The nonlinear mathematical model of a flexible spacecraft is considered a close approximation to the real system. During the process of controller evaluation, the design process has also been taken into account as a factor in assessing the robustness of the system.
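Eigenstructure assignment generalizes pole placement by shaping closed-loop eigenvectors as well as eigenvalues. As a simpler, related illustration (not the parametric method evaluated in this paper), scipy's place_poles computes a state-feedback gain u = -Kx that assigns the closed-loop poles of a toy single-axis attitude model:

```python
import numpy as np
from scipy.signal import place_poles

# Toy double-integrator model of one rigid-body attitude axis: x = [angle, rate]
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Desired well-damped closed-loop poles, chosen here arbitrarily
desired = np.array([-2.0, -1.0])
fsf = place_poles(A, B, desired)
K = fsf.gain_matrix

# The closed-loop matrix A - B K should exhibit the requested eigenvalues
print(np.linalg.eigvals(A - B @ K))
```

A flexible spacecraft adds lightly damped appendage modes to A; eigenstructure assignment then also shapes the eigenvectors to decouple those modes from the rigid-body response, which plain pole placement cannot do.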
Abstract:
The results obtained after incorporating the competence "creativity" into the subject Technical Drawing of the first course of the Degree in Forestry, Technical University of Madrid, are presented in this study. At first, learning activities which could serve two functions at the same time (developing students' creativity and developing other specific competences of the subject) were considered. In addition, changes in the assessment procedure were made, and a method which analyzes two aspects of the assessment of the competence creativity was established. On the one hand, the products are evaluated by analyzing the outcomes obtained by students in the essays suggested and by establishing a parameter to assess the creativity expressed in those essays. On the other, an assessment of the student is carried out directly through a psychometric test previously chosen by the team. Moreover, these results can be applied to similar subjects or may be of general application.
Abstract:
A Digital Elevation Model (DEM) provides the information basis for many geographic applications, such as topographic and geomorphologic studies and landscape analysis through GIS (Geographic Information Systems), among others. The capacity of a DEM to represent the Earth's surface depends on the surface roughness and the resolution used. Each DEM pixel depends on the scale used, characterized by two variables: resolution and the extent of the area studied. DEMs can vary in resolution and accuracy according to the production method, although there are statistical characteristics that remain constant or very similar across a wide range of scales. Based on this property, several techniques have been applied to characterize DEMs through multiscale analysis directly related to fractal geometry: the multifractal spectrum and the structure function. The comparison of the results from both methods is discussed. The study area is represented by a 1024 x 1024 data matrix obtained from a DEM with a resolution of 10 x 10 m per point, corresponding to a region known as "Monte de El Pardo", a property of Spanish National Heritage (Patrimonio Nacional Español) of 15,820 ha located a short distance from the center of Madrid. The Manzanares River crosses this area from north to south. In the southern part there is a reservoir with a capacity of 43 hm3, whose water level ranges from an altitude of 603.3 m up to 632 m at maximum capacity. The minimum altitude of the area is reached in the middle of the reservoir.
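Of the two multiscale tools mentioned, the structure function is the simpler to sketch: it measures the q-order moments of elevation increments as a function of lag. A minimal one-dimensional version along DEM rows (a simplification of the full two-dimensional analysis; the data are synthetic):

```python
import numpy as np

def structure_function(z: np.ndarray, q: float, lags: range) -> np.ndarray:
    """q-order structure function S_q(l) = <|z(x+l) - z(x)|^q> along rows of a grid."""
    return np.array([np.mean(np.abs(z[:, l:] - z[:, :-l]) ** q) for l in lags])

# Synthetic stand-in for the 1024 x 1024 elevation matrix
rng = np.random.default_rng(2)
dem = np.cumsum(rng.normal(size=(128, 128)), axis=1)  # Brownian-like rows

lags = range(1, 33)
S2 = structure_function(dem, q=2.0, lags=lags)
# Scaling exponent from the log-log slope: S_2(l) ~ l^zeta(2)
zeta2 = np.polyfit(np.log(list(lags)), np.log(S2), 1)[0]
print(f"zeta(2) ≈ {zeta2:.2f}")  # close to 1 for Brownian-like rows
```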
Abstract:
This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation, oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out using a publicly available synthetic database with 408,000 hand images on different backgrounds, comparing the performance in terms of accuracy and computational cost to two competitive segmentation methods existing in the literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms current competitive segmentation methods with regard to computational cost, time performance, accuracy and memory usage.
Abstract:
Fractals and multifractals are concepts that have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, commonly using the least squares method. This should not be a problem in itself; however, in many situations involving experimental data, the researcher has to select the range of scales at which to work, neglecting the rest of the points, to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlying point is simply an extreme observation drawn from the tail of a normal distribution whose presence does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points of the experimental data to be used, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology that involves two basic steps:
- Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, thereby considering the implications of reducing the number of points.
- Evaluation of the significance of the slope difference between the fit using the two extreme points and the fit using the available points.
We compare the results of applying this methodology with those of the commonly used least squares approach. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology.
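As an illustration of the comparison between least squares and a robust fit on log-log scaling data, the sketch below uses statsmodels' RLM with a Huber norm as a stand-in for whichever robust estimator was actually employed; the data are synthetic:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic log-log scaling data with one corrupted point at the coarse end
rng = np.random.default_rng(3)
log_scale = np.log(np.arange(1, 21, dtype=float))
log_measure = 0.8 * log_scale + rng.normal(scale=0.02, size=log_scale.size)
log_measure[-1] += 0.5  # outlier that drags the ordinary least squares slope

X = sm.add_constant(log_scale)
ols_slope = sm.OLS(log_measure, X).fit().params[1]
rlm_slope = sm.RLM(log_measure, X, M=sm.robust.norms.HuberT()).fit().params[1]
print(f"OLS slope:    {ols_slope:.3f}")
print(f"Robust slope: {rlm_slope:.3f}  (true slope: 0.8)")
```

The robust fit down-weights the corrupted point instead of forcing the analyst to discard it by hand, which is exactly the subjective choice the methodology aims to avoid.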