868 results for Value analysis
Abstract:
J.A. Ferreira Neto, E.C. Santos Junior, U. Fra Paleo, D. Miranda Barros, and M.C.O. Moreira. 2011. Optimal subdivision of land in agrarian reform projects: an analysis using genetic algorithms. Cien. Inv. Agr. 38(2): 169-178. The objective of this manuscript is to develop a new procedure for optimal land subdivision using genetic algorithms (GA). The genetic algorithm was tested in the rural settlement of Veredas, located in Minas Gerais, Brazil. The implementation was based on land aptitude and its productivity index. The sequence of tests was carried out in two areas with eight different agricultural aptitude classes: one area of 391.88 ha subdivided into 12 lots and another of 404.1763 ha subdivided into 14 lots. The effectiveness of the method was measured by the standard deviation of the productivity indices of the lots in a parceled area. To evaluate each parameter, a sequence of 15 runs was performed to record the average fitness of the best individual (MMI) for each parameter variation. The best parameter combination found in testing and used to generate the new parceling with the GA was the following: 320 generations, a population of 40 individuals, a mutation rate of 0.8, and a renewal rate of 0.3. The solution generated rather homogeneous lots in terms of productive capacity.
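The reported parameter set maps directly onto a standard generational GA loop. A minimal sketch, assuming a hypothetical chromosome encoding (each land cell assigned to a lot) and a fitness function that rewards homogeneous lot productivity; only the four parameter values come from the abstract, everything else is illustrative:

```python
import random

# Parameter values reported in the abstract; the encoding and fitness
# function below are illustrative assumptions, not the authors' code.
N_GENERATIONS = 320
POP_SIZE = 40
MUTATION_RATE = 0.8
RENEWAL_RATE = 0.3   # fraction of the population replaced by fresh random individuals

N_CELLS, N_LOTS = 100, 12   # hypothetical discretization: 100 land cells, 12 lots
APTITUDE = [random.uniform(0.2, 1.0) for _ in range(N_CELLS)]  # stand-in productivity index

def random_individual():
    # A candidate subdivision: assign each land cell to one of the lots.
    return [random.randrange(N_LOTS) for _ in range(N_CELLS)]

def fitness(ind):
    # Homogeneous lots -> low standard deviation of per-lot productivity.
    totals = [0.0] * N_LOTS
    for cell, lot in enumerate(ind):
        totals[lot] += APTITUDE[cell]
    mean = sum(totals) / N_LOTS
    var = sum((t - mean) ** 2 for t in totals) / N_LOTS
    return -var ** 0.5          # higher fitness = more homogeneous parceling

def mutate(ind):
    if random.random() < MUTATION_RATE:     # mutation applied at the reported rate
        ind[random.randrange(N_CELLS)] = random.randrange(N_LOTS)
    return ind

pop = [random_individual() for _ in range(POP_SIZE)]
for _ in range(N_GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    n_new = int(RENEWAL_RATE * POP_SIZE)    # renewal: inject new random individuals
    pop = pop[:POP_SIZE - n_new] + [random_individual() for _ in range(n_new)]
    pop = [mutate(ind[:]) for ind in pop]

print("best individual fitness:", fitness(max(pop, key=fitness)))
```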
Abstract:
Films of isotropic nanocrystalline Pd₈₀Co₂₀ alloys were obtained by electrodeposition onto brass substrates from plating baths maintained at different pH values. Increasing the pH of the plating bath led to an increase in mean grain size without inducing significant changes in the composition of the alloy. The magnetocrystalline anisotropy constant was estimated, and its value was of the same order of magnitude as that reported for samples with perpendicular magnetic anisotropy. First-order reversal curve (FORC) analysis revealed the presence of an important component of reversible magnetization. In addition, FORC diagrams obtained at different sweep rates of the applied magnetic field revealed that this reversible component is strongly affected by kinetic effects. The slight bias observed in the irreversible part of the FORC distribution suggested the dominance of magnetizing intergrain exchange coupling over demagnetizing dipolar interactions and microstructural disorder.
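For reference, the FORC distribution underlying these diagrams is the standard mixed second derivative of the magnetization M measured on first-order reversal curves, with H_a the reversal field and H_b the applied field:

$$\rho(H_a, H_b) = -\frac{1}{2}\,\frac{\partial^2 M(H_a, H_b)}{\partial H_a\,\partial H_b}$$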
Abstract:
Occupational exposure to respirable crystalline silica and to radiation emitted by natural radionuclides present in both rocks and sands was studied in the Brazilian extractive process and in granite product manufacture. Respirable airborne dust samples were collected in working environments where workers perform different tasks with distinct commercial granite types, and also in places where sandblasters work with sands of different origins. The free crystalline silica contents were determined using X-ray diffraction of the respirable particulate fraction of each sample. Dust samples from granite-cutting and sandblasting environments had their natural radionuclide concentrations measured by gamma spectrometry. Dust concentrations in the workplaces were quite variable, reaching values up to 10 times higher than the respirable particulate mass threshold limit value (TLV) of 3 mg m⁻³ set by the American Conference of Governmental Industrial Hygienists (ACGIH). The free crystalline silica concentrations were also high, reaching values up to 48 times the TLV of 0.025 mg m⁻³. Additionally, our results suggest that the risk of radiation-induced cancer in the granite or marble industries is negligible. However, the combined exposure to dust, gamma radiation, and radon daughter products could enhance the lung cancer risks associated with sandblasting activities.
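Made explicit, the exceedances reported correspond to respirable dust concentrations of up to 10 × 3 mg m⁻³ = 30 mg m⁻³ and free crystalline silica concentrations of up to 48 × 0.025 mg m⁻³ = 1.2 mg m⁻³.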
Abstract:
In this work, the results of a spectroscopic study of the southern field narrow-lined Be star HD 171054 are presented. High-dispersion, high signal-to-noise ratio spectra allowed the estimation of fundamental photospheric parameters, such as the projected rotational velocity, effective temperature, and surface gravity, from non-LTE stellar atmosphere models. From these parameters and the microturbulence, the abundances of He, C, N, O, Mg, Al, and Si for this object are estimated. Results show that C is depleted whereas N is overabundant compared with the Sun and with OB stars in the solar vicinity. Oxygen and helium are close to the solar values. Magnesium is down by 0.43 dex, and aluminium and silicon are overabundant.
Abstract:
We analyze the stability properties of equilibrium solutions and the periodicity of orbits in a two-dimensional dynamical system whose orbits mimic the evolution of the price of an asset and the excess demand for that asset. The construction of the system is grounded in a heterogeneous interacting-agent model for a single risky-asset market. An advantage of this construction procedure is that the resulting dynamical system becomes a macroscopic market model which mirrors market quantities and qualities that would typically be taken into account only at the microscopic level of modeling. The system's parameters correspond to: (a) the proportion of speculators in the market; (b) the traders' speculative trend; (c) the degree of heterogeneity of the market agents' idiosyncratic evaluations of the asset's fundamental value; and (d) the strength of the feedback of the population's excess demand on the asset price update increment. This correspondence allows us to employ our results to infer plausible causes for the emergence of price and demand fluctuations in a real asset market. The use of dynamical systems to study the evolution of stochastic models of socio-economic phenomena is quite common in the area of heterogeneous interacting-agent models. However, in the vast majority of cases in the literature, these dynamical systems are one-dimensional. Our work is among the few in the area that construct and analytically study a two-dimensional dynamical system and apply it to the explanation of socio-economic phenomena.
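The abstract does not give the map's equations, but a two-dimensional price/excess-demand system of the kind described can be sketched as follows. The functional forms, default values, and parameter names here are illustrative assumptions keyed to items (a)-(d), not the authors' model:

```python
import math

def step(p, D, kappa=0.5, chi=1.2, sigma=0.4, beta=0.05, p_f=10.0):
    """One iteration of an illustrative price/excess-demand map (not the authors' equations).
    kappa: proportion of speculators (a); chi: traders' speculative trend (b);
    sigma: heterogeneity of idiosyncratic fundamental-value estimates (c);
    beta:  feedback strength of excess demand on the price update (d)."""
    fundamentalist = math.tanh((p_f - p) / sigma)   # pull toward the fundamental value
    speculator = math.tanh(chi * D)                 # extrapolate current excess demand
    D_next = (1.0 - kappa) * fundamentalist + kappa * speculator
    p_next = p + beta * D_next                      # price updated by excess demand
    return p_next, D_next

# Iterate from a mispriced start and observe price/demand fluctuations.
p, D = 12.0, 0.0
for t in range(50):
    p, D = step(p, D)
print(f"p = {p:.4f}, D = {D:.4f}")
```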
Abstract:
An alternative method for the determination of total trans fatty acids, expressed as elaidic acid, by capillary zone electrophoresis (CZE) with indirect UV detection at 224 nm and an analysis time of 7.5 min was developed. The optimized running electrolyte comprises 15.0 mmol L⁻¹ KH₂PO₄/Na₂HPO₄ buffer (pH ≈ 7.0), 4.0 mmol L⁻¹ SDBS, 8.0 mmol L⁻¹ Brij 35, 45% v/v ACN, 8% methanol, and 1.5% v/v n-octanol. Baseline separation of the critical pair C18:1-9c/C18:1-9t, with a resolution higher than 1.5, was achieved using C15:0 as the internal standard. The optimum capillary electrophoresis (CE) conditions for the background electrolyte were established with the aid of Raman spectroscopy and experiments of a 3² factorial design. After response factor (RF) calculations, the CE method was applied to total trans fatty acid (TTFA) analysis in a hydrogenated vegetable fat (HVF) sample and compared with the American Oil Chemists' Society (AOCS) official method by gas chromatography (GC). The methods were compared with an independent-sample t-test, and no significant difference was found between the CE and GC methods within the 95% confidence interval for six genuine replicates of TTFA analysis (p-value > 0.05). The CE method was also applied to TTFA analysis in a spreadable cheese sample. Satisfactory results were obtained, indicating that the optimized methodology can be used for trans fatty acid determination in these samples.
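The CE-versus-GC comparison reported is a standard independent two-sample t-test. A minimal sketch with hypothetical replicate values (the actual TTFA results are not given in the abstract):

```python
from scipy import stats

# Hypothetical TTFA contents (g per 100 g of HVF), six replicates per method.
ce = [28.4, 28.9, 28.1, 28.6, 28.7, 28.3]
gc = [28.6, 28.2, 28.8, 28.5, 28.9, 28.4]

t_stat, p_value = stats.ttest_ind(ce, gc)   # independent two-sample t-test
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# p > 0.05 -> no significant difference between CE and GC at the 95% confidence level
```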
Abstract:
Differential scanning calorimetry (DSC), thermogravimetry/derivative thermogravimetry (TG/DTG), and infrared (IR) spectroscopy were used to investigate the compatibility between prednicarbate and several excipients commonly used in semi-solid pharmaceutical forms. The thermoanalytical studies of 1:1 (m/m) drug/excipient physical mixtures showed that the onset of the first thermal decomposition stage of prednicarbate (the T_onset value) was lowered in the presence of stearyl alcohol and glyceryl stearate compared with the drug alone. For the binary mixture of the drug with sodium pyrrolidone carboxylate, the first thermal decomposition stage was not changed, but the DTG peak temperature (T_peak DTG) decreased. Comparison of the IR spectra of the drug, the physical mixtures, and the thermally treated samples confirmed the thermal decomposition of prednicarbate. Comparison of the thermal profiles of the 1:1 prednicarbate:excipient mixtures with methylparaben, propylparaben, carbomer 940, acrylate crosspolymer, lactic acid, light liquid paraffin, isopropyl palmitate, myristyl lactate, and cetyl alcohol revealed no interaction.
Abstract:
This work presents the use of sequential injection analysis (SIA) and response surface methodology as tools for the optimization of Fenton-based processes. Alizarin Red S dye (C.I. 58005) was used as a model compound for the anthraquinone family, whose pigments are widely used in the coatings industry. The following factors were considered: the [H₂O₂]:[Alizarin] and [H₂O₂]:[FeSO₄] ratios and the pH. The SIA system was designed to add reagents to the reactor and to perform on-line sampling of the reaction medium, sending the samples to a flow-through spectrophotometer for monitoring the color reduction of the dye. The proposed system fed the statistical program with degradation data for fast construction of response-surface plots. After optimization, 99.7% of the dye was degraded and the TOC content was reduced to 35% of its original value. Low reagent consumption and high sampling throughput were the remarkable features of the SIA system.
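The response-surface step amounts to fitting a second-order polynomial to the degradation data. A minimal sketch for two coded factors of a 3² design with hypothetical degradation values (the actual design matrix and responses are not given in the abstract):

```python
import numpy as np

# Coded levels (-1, 0, +1) for two factors of a 3^2 factorial design,
# e.g. the [H2O2]:[Alizarin] ratio and pH; degradation (%) values are hypothetical.
x1, x2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
x1, x2 = x1.ravel(), x2.ravel()
y = np.array([72, 85, 80, 78, 96, 88, 70, 90, 83], dtype=float)

# Quadratic response surface: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(y), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```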
Abstract:
This work describes the development, electrochemical characterization, and use of a cobalt phthalocyanine-modified carbon nanotube electrode for the quantitative determination of dopamine (DA) in 0.2 mol L⁻¹ phosphate buffer contaminated with a high concentration of ascorbic acid. The electrode surface was analyzed by cyclic voltammetry and electrochemical impedance spectroscopy, which showed that the modified surface presents a charge-transfer resistance of 500 Ω, against the 16.46 kΩ value found for the bare glassy carbon (GC) surface. A pseudo rate constant of 5.4 × 10⁻⁴ cm s⁻¹ for dopamine oxidation was calculated. Voltammetric experiments showed a shift of the DA oxidation peak potential to a less positive value, 390 mV, compared with 570 mV at the bare GC electrode. The electrochemical determination of dopamine by differential pulse voltammetry, in the presence of ascorbic acid at concentrations up to 0.1 mol L⁻¹, yielded a detection limit as low as 2.56 × 10⁻⁷ mol L⁻¹.
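The quoted detection limit is presumably of the conventional 3σ form, i.e. the concentration whose signal exceeds the blank by three standard deviations of the blank:

$$\mathrm{LOD} = 3\,s_{\text{blank}}/m$$

where $s_{\text{blank}}$ is the standard deviation of the blank signal and $m$ is the slope of the calibration curve.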
Abstract:
Pulp and paper production is a very energy-intensive industry sector. Both Sweden and the U.S. are major pulp and paper producers. This report examines the energy use and the CO₂ emissions connected with the pulp and paper industry in the two countries from a life-cycle perspective.

New technologies make it possible to increase electricity production in the integrated pulp and paper mill through black liquor gasification and a combined cycle (BLGCC). That way, the mill can produce excess electricity, which can be sold and replace electricity produced in power plants. In this process, the by-products formed during pulp making are used as fuel to produce electricity. In today's pulp and paper mills, the technology for generating energy from these by-products in a Tomlinson boiler is not as efficient as the BLGCC technology. Scenarios were designed to investigate the results of using the BLGCC technique in a life-cycle analysis. Two scenarios are represented by a 1994 mill in the U.S. and a 1994 mill in Sweden; these are based on the average energy intensity of pulp and paper mills as operated in 1994 in the U.S. and in Sweden, respectively. The two other scenarios are constituted by a »reference mill« in the U.S. and in Sweden using state-of-the-art technology. We investigate the impact of varying recycling rates on total energy use and CO₂ emissions from the production of printing and writing paper. To economize on wood and thereby save trees, the trees freed up by recycling can be used in a biomass gasification combined cycle (BIGCC) to produce electricity in a power station. This produces extra electricity with a lower CO₂ intensity than electricity generated by, for example, coal-fired power plants.

The life-cycle analysis in this thesis also includes waste treatment in the paper life cycle. Both Sweden and the U.S. recycle paper, yet a lot of paper waste remains; this paper is part of the countries' municipal solid waste (MSW). Much of the MSW is landfilled, but parts of it are incinerated to generate electricity. The thesis designed special scenarios for the treatment of MSW in the life-cycle analysis.

The report studies and compares the two countries and two different BLGCC efficiencies in four scenarios. This gives a wide survey and points to the essential parameters to reflect on when making assumptions in a life-cycle analysis. The report shows that three key parameters have to be carefully considered when making a life-cycle analysis of wood from an energy and CO₂-emission perspective in pulp and paper mills in the U.S. and in Sweden: first, the energy efficiency of the pulp and paper mill; second, the efficiency of the BLGCC; and last, the CO₂ intensity of the electricity displaced by BIGCC- or BLGCC-generated electricity. It also shows that, with the technology available today, it is possible to produce CO₂-free paper with a waste paper content of up to 30%. The thesis discusses the system boundaries and the assumptions. Further and more detailed research, including among other things the system boundaries and forestry, is recommended for more specific answers.
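The CO₂ accounting this turns on can be sketched as a simple balance: net emissions equal the mill's own fossil emissions minus the grid emissions displaced by the excess BLGCC/BIGCC electricity. Every figure below is a hypothetical placeholder, not data from the thesis:

```python
# Illustrative CO2 balance per tonne of paper; all numbers are hypothetical
# placeholders, not the thesis's data.
mill_fossil_co2 = 250.0      # kg CO2/t paper emitted by the mill itself
excess_electricity = 1.8     # MWh/t paper exported by the BLGCC
grid_co2_intensity = 400.0   # kg CO2/MWh of the displaced grid electricity

net_co2 = mill_fossil_co2 - excess_electricity * grid_co2_intensity
print(f"net CO2: {net_co2:.0f} kg/t paper")   # negative -> net CO2-free paper
```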
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch-screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects: 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x–y coordinates) and timestamps (milliseconds), were collected and used in the subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data-processing methods. To quantify motor timing variability during the spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, and after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by the total drawing completion time and used in the subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores used in the subsequent analysis. The first method was based on the digital wavelet transform and principal component analysis and generated a score representing spiral drawing impairment; this score is henceforth denoted WAV. The second method was based on the standard deviation of the frequency-filtered drawing velocity; this score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between the HE subjects and the three patient groups (P=0.626 for the S group, with a 9.9% mean value difference; P=0.089 for the I group, with 30.2%; and P=0.0019 for the A group, with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55).
CONCLUSIONS: The results show that the digital-spiral-analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
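Approximate Entropy as used above is a standard construction (Pincus). A self-contained sketch with the parameters reported in the abstract (window m = 4, tolerance r = 0.2 × SD of the series), leaving the normalization by drawing completion time to the caller:

```python
import numpy as np

def approximate_entropy(x, m=4, r_frac=0.2):
    """Approximate Entropy of a 1-D series; m is the window size and the
    tolerance r is r_frac times the standard deviation of the series."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Fraction of templates within Chebyshev distance r of each template
        counts = [
            np.mean(np.max(np.abs(templates - templates[i]), axis=1) <= r)
            for i in range(n)
        ]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

# Example: an irregular series scores higher than a regular one.
t = np.arange(500)
regular = np.sin(0.2 * t)
irregular = regular + np.random.default_rng(0).normal(0, 0.5, t.size)
print(approximate_entropy(regular), approximate_entropy(irregular))
```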
Abstract:
The aim of this paper is to develop a flexible model for the analysis of quantitative trait loci (QTL) in outbred line crosses which includes both additive and dominance effects. Our flexible intercross analysis (FIA) model accounts for QTL that are not fixed within founder lines and is based on the variance-component framework. Genome scans with FIA are performed using a score statistic, which does not require variance-component estimation. RESULTS: Simulations of a pedigree with 800 F2 individuals showed that the power of FIA including both additive and dominance effects was almost 50% for a QTL with equal allele frequencies in both lines, complete dominance, and a moderate effect, whereas the power of a traditional regression model equaled the chosen significance level of 5%. The power of FIA without dominance effects in the model was close to that obtained for FIA with dominance for all simulated cases except QTL with overdominant effects. A genome-wide linkage analysis of experimental data from an F2 intercross between Red Jungle Fowl and White Leghorn was performed with both additive and dominance effects included in FIA. The score values for chicken body weight at 200 days of age were similar to those obtained in the FIA analysis without dominance. CONCLUSION: We have extended FIA to include QTL dominance effects. The power of FIA was superior, or similar, to that of standard regression methods for QTL effects with dominance. The difference in power for FIA with or without dominance is expected to be small as long as the QTL effects are not overdominant. We suggest that FIA with only additive effects should be the standard model, especially since it is more computationally efficient.
Abstract:
A major problem in e-service development is the prioritization of the requirements of different stakeholders. The main stakeholders are governments and their citizens, all of whom have different and sometimes conflicting requirements. In this paper, the prioritization problem is addressed by combining a value-based approach with an illustration technique. The paper examines the following research question: how can multiple stakeholder requirements be illustrated from a value-based perspective so that they can be prioritized? We used an e-service development case from a Swedish municipality to elaborate our approach. Our contributions are: 1) a model of the domains relevant to requirement prioritization, namely government, citizens, technology, finances, and laws and regulations; and 2) a requirement fulfillment analysis tool (RFA) that consists of a requirement-goal-value matrix (RGV) and a calculation and illustration module (CIM). The model reduces cognitive load, helps developers focus on value fulfillment in e-service development, and supports them in the formulation of requirements. It also offers an input to public policy makers, should they aim to target values in the design of e-services.
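The abstract does not detail the RGV matrix, but the idea of scoring requirements by how the goals they serve map onto stakeholder values can be sketched as follows; all names, weights, and the scoring rule are hypothetical illustrations, not the paper's tool:

```python
# Hypothetical requirement-goal-value matrix (RGV): requirements link to goals,
# and goals carry weights toward stakeholder values; everything here is illustrative.
goal_value = {                        # goal -> {stakeholder value: weight}
    "24/7 availability": {"citizen convenience": 0.9, "cost efficiency": 0.3},
    "single sign-on":    {"citizen convenience": 0.6, "legal compliance": 0.4},
    "audit logging":     {"legal compliance": 0.9},
}
requirement_goals = {                 # requirement -> goals it contributes to
    "R1 mobile interface": ["24/7 availability"],
    "R2 eID integration":  ["single sign-on", "audit logging"],
}

def fulfillment(req):
    """Sum the value weights reachable from a requirement via its goals (CIM-style score)."""
    return sum(w for g in requirement_goals[req] for w in goal_value[g].values())

for req in requirement_goals:         # rank requirements by value fulfillment
    print(req, "->", round(fulfillment(req), 2))
```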
Abstract:
In the field of Information and Communication Technologies for Development (ICT4D), ICT use in education is well studied. Education is often seen as a prerequisite for development, and ICTs are believed to aid education, e.g., by making it more accessible and by increasing its quality. In this paper, we study access to and use of ICT in a study circle (SC) education program on the south coast of Kenya. The study is qualitative, reporting results based on interviews and observations with SC participants, government officers, and SC coordinators and teachers. The study builds on the capability approach perspective on development, which focuses on individuals' opportunities and ability to live a life that they value. The aim of the study is to investigate the capability outcomes enabled by the capability inputs of access to and use of ICT in education, as well as the factors that enabled and/or restricted those outcomes. Findings show that many opportunities have been enabled, such as an increased ability to generate an income, learning benefits, community development, and basic human development (e.g., literacy and self-confidence). However, conversion factors such as a poorly developed infrastructure and poor IT literacy prevent many individuals from taking full advantage of the ICT and the opportunities it enables.
Abstract:
Even though assessing social marketing endeavors proves challenging, evaluators can learn from previous campaigns and identify which facets of social marketing events, programs, and campaigns need to be improved. Additionally, by analyzing social movements and evaluating how they connect to social marketing, we can gain a clearer view of ways to improve the field of social marketing. As social marketing becomes increasingly sophisticated and similar to commercial marketing, there is hope that it can yield higher rates of success in the future. Friend and Levy (2002) claimed that it was nearly impossible to compare social marketing endeavors using quantitative criteria and advocated the use of qualitative methods. However, if social marketing scholars developed a more systematic paradigm for assessing events, programs, and campaigns that employed a combination of quantitative and qualitative methods, it would be easier to establish which social marketing efforts generated more success than others. When there are too many confounding variables, conclusions cannot always be drawn, and evaluations may not be viewed as legitimate. As a result, critics become skeptical of social marketing's value, and both the importance and the credibility of social marketing decline. With the establishment of proper criteria and evaluation methods, social marketing can progress and initiate more social change.