902 results for "Application method"


Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to develop a crowdsourced videographic research method for consumer culture research. Videography provides opportunities for expressing contextual and culturally embedded relations, so developing new ways to conduct videographic research is worthwhile. This study develops the crowdsourced videographic method on the basis of a literature review and the evaluation of a focal study. The literature review follows a qualitative systematic review process and draws on methodological, crowdsourcing and consumer research literature to define the method, its application process and its evaluation criteria. The evaluation of the focal study, in which the method was applied, completes the study. Professional review with self-evaluation is applied as the form of evaluation, drawing on secondary data including the research task description, screenshots of the mobile application used in the focal study, the videos collected from participants, and the author's self-evaluation. The focal study is analysed with respect to its suitability for consumer culture research, its research process and its quality. Definitions and descriptions of the research method, its process and its quality criteria form the theoretical contribution of this study; evaluating the focal study against these definitions highlights some best practices of this type of research, forming the practical contribution. Finally, this study suggests directions for future research: first, defining the boundaries of crowdsourcing use in the various stages of conducting research; second, improving the method by applying it in new research contexts; third, testing how changes in one dimension of the crowdsourcing models interact with the other dimensions; and fourth, comparing the quality criteria applied in this study with other quality criteria to improve the method's usefulness.
Overall, this study represents a starting point for further development of the crowdsourced videographic research method.


In this thesis, the design of bandpass filters tunable over 400 MHz – 800 MHz is investigated. Microwave filters are vital components that provide frequency selectivity in a wide variety of electronic systems operating at high frequencies. With the emergence of multi-band communication and the diverse applications of wireless devices, there is a clear need for tunable filters. One potential application of frequency-agile filters is in the front ends and spectrum sensors of Cognitive Radio (CR). The principle of CR is to detect an available portion of spectrum and operate in it without interfering with the primary user's signals; this approach improves the efficiency of utilizing allocated spectrum such as the TV band (400 MHz – 800 MHz). The focus of this work is the development of sufficiently compact, low-cost tunable filters with a fairly narrow bandwidth using currently available lumped-element components and PCB technology. Filter design, different topologies and methods of tuning bandpass filters are considered. As a result, three bandpass filter topologies were simulated and realised. They use digitally tunable capacitors (DTCs) to adjust the centre frequency within the TV "white space" spectrum. Measurements confirmed that the circuits presented in this work have the expected output response and that the filters are successfully tuned by the DTCs.
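The tuning principle described above, a DTC changing the capacitance of an LC resonator to move its centre frequency, can be sketched numerically. This is a simplified illustration; the inductance value below is hypothetical, and a real design must also account for component Q, parasitics and the chosen filter topology.

```python
import math

def resonant_frequency_hz(l_henry: float, c_farad: float) -> float:
    """Centre frequency of an ideal LC resonator: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

def capacitance_for_frequency(l_henry: float, f_hz: float) -> float:
    """Capacitance a DTC must present to centre the resonator at f."""
    return 1.0 / (l_henry * (2.0 * math.pi * f_hz) ** 2)

L = 10e-9  # 10 nH inductor (hypothetical value)
for f in (400e6, 600e6, 800e6):
    c = capacitance_for_frequency(L, f)
    print(f"{f / 1e6:.0f} MHz -> C = {c * 1e12:.2f} pF")
```

Sweeping the TV band from 400 MHz to 800 MHz requires the capacitance to cover roughly a 4:1 range, which is within reach of commercial DTCs.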


The electrocardiographic (ECG) QT interval is influenced by fluctuations in heart rate (HR), which may lead to misinterpretation of its length. Since alterations in QT interval length reflect abnormalities of ventricular repolarisation that predispose to arrhythmias, this variable must be properly evaluated. The aim of this work was to determine which method of correcting the QT interval is the most appropriate for dogs across different ranges of normal HR (different breeds). Healthy adult dogs (n = 130; German Shepherd, Boxer, Pit Bull Terrier, and Poodle) underwent ECG examination; QT intervals were measured in triplicate from the bipolar limb lead II and corrected for the effects of HR by applying three published formulae involving quadratic, cubic or linear regression. The mean corrected QT values (QTc) obtained with the different formulae were significantly different (p < 0.05), and those derived from the linear-regression equation QTcV = QT + 0.087(1 − RR) were the most consistent. QTcV values were strongly correlated (r = 0.83) with the QT interval and showed a coefficient of variation of 8.37% and a 95% confidence interval of 0.22–0.23 s. Owing to its simplicity and reliability, QTcV was considered the most appropriate correction of the QT interval in dogs.
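The linear correction favoured by the abstract is simple enough to apply directly. The snippet below implements the stated formula, QTcV = QT + 0.087(1 − RR) with both intervals in seconds; the example measurements are hypothetical.

```python
def qtcv(qt_s: float, rr_s: float) -> float:
    """Linear HR correction from the study: QTcV = QT + 0.087 * (1 - RR),
    with the QT and RR intervals given in seconds."""
    return qt_s + 0.087 * (1.0 - rr_s)

# Hypothetical example: HR of 120 bpm gives RR = 60/120 = 0.5 s
qt, rr = 0.18, 0.5
print(f"QT = {qt} s, RR = {rr} s -> QTcV = {qtcv(qt, rr):.4f} s")  # -> 0.2235 s
```

At RR = 1 s (HR of 60 bpm) the correction vanishes and QTcV equals the measured QT, as expected from the formula.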


The Mathematica system (version 4.0) is employed in the solution of nonlinear diffusion and convection-diffusion problems, formulated as transient one-dimensional partial differential equations with potential-dependent equation coefficients. The Generalized Integral Transform Technique (GITT) is first implemented for the hybrid numerical-analytical solution of such classes of problems, through the symbolic integral transformation and elimination of the space variable, followed by the use of the built-in Mathematica function NDSolve to handle the resulting transformed ODE system. This approach offers an error-controlled final numerical solution, through the simultaneous control of the local errors in this reliable ODE solver and of the truncation order of the proposed eigenfunction expansion. For co-validation purposes, the same built-in function NDSolve is employed in the direct solution of these partial differential equations, as made possible by the algorithms implemented in Mathematica (versions 3.0 and up) based on the method of lines. Various numerical experiments are performed and the relative merits of each approach are critically pointed out.
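The direct approach mentioned above, the method of lines, discretises the space variable and hands the resulting ODE system to a stiff solver. A minimal sketch of that idea, here in Python with SciPy rather than Mathematica's NDSolve, for a nonlinear 1-D diffusion equation u_t = (D(u) u_x)_x with an assumed potential-dependent coefficient D(u) = 1 + u and Dirichlet boundaries:

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 51
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

def rhs(t, u):
    """Semi-discrete right-hand side of u_t = d/dx( D(u) du/dx )."""
    D = 1.0 + u                       # potential-dependent diffusivity (assumed form)
    D_face = 0.5 * (D[:-1] + D[1:])   # diffusivity at cell faces
    flux = D_face * np.diff(u) / dx   # D(u) * du/dx at the faces
    dudt = np.zeros_like(u)
    dudt[1:-1] = np.diff(flux) / dx   # divergence of the flux
    return dudt                        # endpoints fixed: u(0)=1, u(1)=0

u0 = np.where(x < 0.5, 1.0, 0.0)
u0[0], u0[-1] = 1.0, 0.0
sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", rtol=1e-8, atol=1e-10)
print("solved:", sol.success, "| final profile range:",
      sol.y[:, -1].min(), "to", sol.y[:, -1].max())
```

The local error control of the ODE integrator (here BDF tolerances) plays the same role as the reliable ODE solver settings discussed in the abstract; in the GITT variant the same solver would act on the transformed expansion coefficients instead of nodal values.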


Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data, and unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, over-reporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models, with low measurement accuracy affecting the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
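The weighted Kaplan-Meier estimator underlying the IPCW correction can be sketched compactly. The sketch below is a generic weighted product-limit estimator, not the study's own implementation; with IPCW, the per-subject weights would be the inverse estimated probabilities of remaining uncensored. The toy data are hypothetical, and with equal weights the function reduces to the ordinary Kaplan-Meier estimator.

```python
import numpy as np

def weighted_km(times, events, weights):
    """Kaplan-Meier survival estimate with per-subject weights (e.g. inverse
    probability of censoring weights). Returns the distinct event times and
    the estimated survival at each of them."""
    order = np.argsort(times)
    t, d, w = times[order], events[order], weights[order]
    event_times = np.unique(t[d == 1])
    surv, s = [], 1.0
    for et in event_times:
        at_risk = w[t >= et].sum()              # weighted size of the risk set
        deaths = w[(t == et) & (d == 1)].sum()  # weighted events at this time
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return event_times, np.array(surv)

# Toy spell data (hypothetical): durations, event indicators, equal weights
times = np.array([2.0, 3.0, 3.0, 5.0, 8.0])
events = np.array([1, 1, 0, 1, 0])
w = np.ones_like(times)
et, s = weighted_km(times, events, w)
print(et, s)  # survival drops at each observed event time
```

Replacing `w` with inverse censoring probabilities reweights subjects who were likely to be censored, which is what removes the dependent-censoring bias discussed above.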


Chlorophyll fluorescence is currently used as a rapid, non-destructive diagnostic method to detect and quantify damage to the photosynthetic apparatus of the leaves of weeds, crops and ornamental/coniferous trees in response to both environmental stress and herbicides. This study aimed to evaluate chlorophyll fluorescence in guanandi plants (Calophyllum brasiliense) after the application of different post-emergence herbicides. The experiment was performed in a completely randomized design with six treatments (control, bentazon, sulfentrazone, isoxaflutole, atrazine and glyphosate) and five replications. The herbicide treatments were applied with a stationary sprayer, and the electron transport rate (ETR) was subsequently measured with an OS5p Multi-Mode Chlorophyll Fluorometer. In the monitored period, guanandi plants subjected to atrazine showed higher sensitivity, as measured by chlorophyll fluorescence, than those in the other treatments. Although bentazon is a photosystem II inhibitor, it caused no major changes in electron transport for the studied species in the monitored period. In summary, ETR is a good parameter for evaluating the effect of some herbicides on Calophyllum brasiliense plants.


Real option valuation, in particular the fuzzy pay-off method, has proven to be useful in defining risk and visualizing imprecision of investments in various industry applications. This study examines whether the evaluation of risk and profitability for public real estate investments can be improved by using real option methodology. Firstly, the context of real option valuation in the real estate industry is examined. Further, an empirical case study is performed on 30 real estate investments of a Finnish government enterprise in order to determine whether the presently used investment analysis system can be complemented by the pay-off method. Despite challenges in the application of the pay-off method to the case company’s large investment base, real option valuation is found to create additional value and facilitate more robust risk analysis in public real estate applications.
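The fuzzy pay-off method builds a triangular fuzzy NPV from three scenario cash-flow estimates and derives the real option value from its positive side. The sketch below is a simplified numeric illustration, not the study's implementation: the canonical method uses the possibilistic mean of the positive side, whereas here a membership-weighted mean is used as a stand-in, and the scenario figures are hypothetical.

```python
import numpy as np

def fuzzy_payoff_rov(pessimistic, base, optimistic, n=100_001):
    """Numeric sketch of the fuzzy pay-off method: build a triangular fuzzy
    NPV from three scenario values, then weight the mean of the positive side
    by the share of the pay-off distribution that is positive."""
    x = np.linspace(pessimistic, optimistic, n)
    dx = x[1] - x[0]
    # triangular membership function peaking at the base-case NPV
    mu = np.where(x <= base,
                  (x - pessimistic) / (base - pessimistic),
                  (optimistic - x) / (optimistic - base))
    total_area = mu.sum() * dx
    pos = x > 0.0
    pos_area = mu[pos].sum() * dx
    if pos_area == 0.0:
        return 0.0                     # no positive outcomes: option value is zero
    mean_pos = (mu[pos] * x[pos]).sum() / mu[pos].sum()
    return (pos_area / total_area) * mean_pos

# Fully positive, symmetric scenario range (hypothetical figures, MEUR)
print(round(fuzzy_payoff_rov(10.0, 20.0, 30.0), 3))  # -> 20.0
```

When the whole pay-off distribution is positive, the weight is one and the value collapses to the central estimate; an entirely negative distribution yields zero, which is the option-like asymmetry the method is designed to capture.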


Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for a physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated: pebble beds matching the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided.
A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between the discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realizable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, most likely due to the better performance of that model in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries.
It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of experiments should also be considered and documented in reasonable detail.
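The DEM-to-CFD porosity mapping described above, computing the void fraction of each mesh cell overlapped by pebbles, can be sketched with a Monte Carlo sampling approach. This is a simplified stand-in for the thesis's MATLAB code (which is not reproduced here): sample points in a box-shaped cell, test them against the sphere positions from a DEM result, and report the void fraction. The single centred pebble below is a hypothetical test case.

```python
import numpy as np

rng = np.random.default_rng(42)

def cell_porosity(cell_min, cell_max, centers, radius, samples=20_000):
    """Monte Carlo estimate of the porosity (void fraction) of one box-shaped
    mesh cell overlapped by equal-radius spheres from a packing simulation."""
    pts = rng.uniform(cell_min, cell_max, size=(samples, 3))
    # squared distance from every sample point to every sphere centre
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    inside_solid = (d2 <= radius**2).any(axis=1)
    return 1.0 - inside_solid.mean()

# Test case: a unit cell containing a single pebble inscribed in it,
# where the exact porosity is 1 - pi/6 ~ 0.476
centers = np.array([[0.5, 0.5, 0.5]])
print(f"estimated porosity: {cell_porosity(np.zeros(3), np.ones(3), centers, 0.5):.3f}")
```

Repeating this per cell yields the porosity field needed to couple the discrete packing to a continuum thermal-hydraulics model; analytic sphere-box intersection would be faster but considerably more involved.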


Cloning of the T-cell receptor genes is a critical step in generating T-cell receptor transgenic mice. Because T-cell receptor molecules are clonotypic, isolation of their genes requires reverse transcriptase-assisted PCR using primers specific for each different Vα or Vβ gene, or the screening of cDNA libraries generated from the RNA of each individual T-cell clone. Although feasible, these approaches are laborious and costly. The aim of the present study was to test the non-palindromic adaptor-PCR method as an alternative for isolating the genes encoding the T-cell receptor of an antigen-specific T-cell hybridoma. For this purpose, we established hybridomas specific for trans-sialidase, an immunodominant Trypanosoma cruzi antigen. These T-cell hybridomas were characterized with regard to their ability to secrete interferon-gamma, IL-4, and IL-10 after stimulation with the antigen. A CD3+, CD4+, CD8- interferon-gamma-producing hybridoma was selected for identification of the variable regions of the T-cell receptor by the non-palindromic adaptor-PCR method. Using this methodology, we were able to rapidly and efficiently determine the variable regions of both T-cell receptor chains. The results obtained by the non-palindromic adaptor-PCR method were confirmed by the isolation and sequencing of the complete cDNA genes and by recognition with a specific antibody against the T-cell receptor variable β chain. We conclude that the non-palindromic adaptor-PCR method can be a valuable tool for the identification of the T-cell receptor transcripts of T-cell hybridomas and may facilitate the generation of T-cell receptor transgenic mice.


A simple experimental protocol applying a quantitative ultrasound (QUS) pulse-echo technique was used to measure the acoustic parameters of healthy femoral diaphyses of Wistar rats in vivo. Five quantitative parameters [apparent integrated backscatter (AIB), frequency slope of apparent backscatter (FSAB), time slope of apparent backscatter (TSAB), integrated reflection coefficient (IRC), and frequency slope of integrated reflection (FSIR)] were calculated using the echoes from cortical and trabecular bone in the femurs of 14 Wistar rats. Signal acquisition was performed three times in each rat, with the ultrasound signal acquired along the femur's central region from three positions 1 mm apart from each other. The parameters estimated for the three positions were averaged to represent the femur diaphysis. The results showed that the AIB, FSAB, TSAB, and IRC values were statistically similar, but the FSIR values from Experiments 1 and 3 were different. Furthermore, Pearson's correlation coefficient showed, in general, strong correlations among the parameters. The proposed protocol and calculated parameters demonstrated the potential to characterize the femur diaphysis of rats in vivo. The results are relevant because the bone structure of rats is very similar to that of humans, making this an important step toward preclinical trials and the subsequent application of QUS in humans.


Wood-based bioprocesses are one of the most promising fields of interest in the circular economy. Expanding the use of wood raw material in sustainable industrial processes is acknowledged on both a global and a regional scale. This thesis concerns the application of a capillary zone electrophoresis (CZE) method with the aim of monitoring wood-based bioprocesses. The range of detectable carbohydrate compounds is expanded to include furfural and polydatin in aqueous matrices. The experimental work was conducted on a laboratory scale with samples imitating process samples. This thesis presents a novel strategy for uncertainty evaluation via in-house validation, with a focus on the uncertainty factors of the CZE method. Because CZE equipment is sensitive to ambient conditions, proper validation is essential for robust application. This thesis thus introduces a tool for the monitoring of modern bioprocesses. It is concluded that the applied CZE method provides additional information about the analysed samples and that the profiling approach is suitable for detecting changes in process samples. The CZE method shows significant potential in process monitoring because of its capability to simultaneously detect clusters of carbohydrate-related compounds. The clusters can be used as summary terms indicating process variation and drift.


A comparative analysis of the theoretical-experimental study developed by Hsu on the hydration of Amsoy 71 soybean grain was performed through several soaking experiments using CD 202 soybean at 10, 20, 30, 40, and 50 °C, with moisture content measured over time. The results showed that the equilibrium moisture content of CD 202 soybean, Xeq, does not depend on temperature and is 21% higher than that found by Hsu, suggesting that the soybean cultivar exerts a great influence on Xeq. The Hsu model was solved numerically and its parameters were adjusted by the least squares method, with maximum deviations of ±10% relative to the experimental values. The limiting step in the mass transfer process during hydration is water diffusion inside the grain, leading to radial moisture gradients that decrease over time and with increasing temperature. Regardless of the soybean cultivar, diffusivity increases as temperature or moisture content increases. However, the values of this transport property for Amsoy 71 were higher than those for CD 202, being very close at the beginning of hydration at 20 °C and almost three times higher at the end of hydration at 50 °C.
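The least-squares parameter adjustment described above can be sketched as follows. This is not the Hsu diffusion model itself: as a simplified stand-in, a first-order hydration model X(t) = Xeq − (Xeq − X0)·exp(−k·t) is fitted to synthetic soaking data, and all numeric values below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def hydration(t, xeq, k, x0=0.10):
    """Simplified first-order hydration model: moisture X approaches the
    equilibrium value Xeq from the initial value X0 at rate k."""
    return xeq - (xeq - x0) * np.exp(-k * t)

# Synthetic soaking data (hypothetical): 25 points over 10 h with small noise
t = np.linspace(0.0, 10.0, 25)
true_xeq, true_k = 0.60, 0.45
rng = np.random.default_rng(0)
x_obs = hydration(t, true_xeq, true_k) + rng.normal(0.0, 0.005, t.size)

# Least-squares adjustment of Xeq and k (x0 kept at its default)
popt, _ = curve_fit(hydration, t, x_obs, p0=[0.5, 0.3])
print(f"fitted Xeq = {popt[0]:.3f} (true 0.600), k = {popt[1]:.3f} 1/h (true 0.450)")
```

Fitting the full Hsu model would replace `hydration` with the numerical solution of the radial diffusion equation, but the least-squares machinery is the same.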


Solid mixtures for refreshments are fully integrated into Brazilian consumers' daily routine because of their quick preparation, yield and reasonable price, considerably lower than that of ready-to-drink products, which makes them economically more accessible to low-income populations. In this context, the aim of this work was to evaluate the physicochemical and mineral composition, as well as the hygroscopic behaviour, of four different brands of solid mixture for mango refreshment. The BET, GAB, Oswin and Henderson mathematical models were fitted to the experimental adsorption isotherm data. The physicochemical evaluation showed that the solid mixtures for refreshments are considerable sources of ascorbic acid and reducing sugars and, regarding minerals, significant sources of calcium, sodium and potassium. The solid mixtures of all four studied brands were also found to be highly hygroscopic.
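Of the isotherm models named above, the GAB equation is the most widely used for foods. A minimal sketch of evaluating it, with hypothetical parameter values (the monolayer moisture Xm and the GAB constants C and K would normally come from fitting the experimental adsorption data):

```python
def gab_moisture(aw: float, xm: float, c: float, k: float) -> float:
    """GAB sorption isotherm: X = Xm*C*K*aw / ((1 - K*aw) * (1 - K*aw + C*K*aw)),
    giving equilibrium moisture content X at water activity aw."""
    return xm * c * k * aw / ((1.0 - k * aw) * (1.0 - k * aw + c * k * aw))

# Hypothetical parameters for a hygroscopic powder
xm, c, k = 0.08, 10.0, 0.9
for aw in (0.2, 0.4, 0.6, 0.8):
    print(f"aw = {aw:.1f} -> X = {gab_moisture(aw, xm, c, k):.4f} g water/g solids")
```

The steep rise of X at high water activity is the signature of a highly hygroscopic product like the mixtures described above.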


The objective of this research was to use Exploratory Factor Analysis (EFA) to refine a tool for assessing fish consumption and the characteristics involved in this process. Data were collected during a campaign to encourage fish consumption in Brazil, with the voluntary participation of members of a university community. An assessment instrument consisting of multiple-choice questions and a five-point Likert scale was designed and used to measure the importance of certain attributes that influence the choice and consumption of fish. The sample was composed of 224 individuals, the majority of them women (65.6%). With regard to the frequency of fish consumption, 37.67% of the volunteers interviewed said they consume the product two or three times a month, and 29.6% once a week. EFA was used to group the variables; extraction was performed using principal components and rotation using the Quartimax method. The results show that the variables cluster into two main constructs, quality and consumption, with Cronbach's alpha coefficients of 0.75 and 0.69, respectively, indicating good internal consistency.
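The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute from an item-score matrix. A minimal sketch with hypothetical Likert responses (the study's own data are not reproduced):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Toy data (hypothetical): 5 respondents x 3 items of one construct
scores = np.array([[4, 5, 4],
                   [3, 3, 4],
                   [5, 5, 5],
                   [2, 3, 2],
                   [4, 4, 5]])
print(round(cronbach_alpha(scores), 2))  # -> 0.92
```

Values of 0.75 and 0.69 as reported for the quality and consumption constructs are conventionally read as acceptable internal consistency for exploratory work.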


The fish industry generates a high volume of waste from which fish oil lipids can be extracted for use in nutraceuticals and foods. The objective of this study was to produce unsaturated fatty acids from industrial fish oil by means of a differentiated hydrolysis process. The samples were crude fish oil obtained from the Campestre industry, characterized by physico-chemical parameters according to AOCS methods (acidity, peroxide, saponification and iodine values, and percentage of free fatty acids); the fatty acid profile was obtained by a derivatization method followed by gas chromatography. The oleochemical indices obtained for the refined oil were similar to data found in the literature. The polyunsaturated fatty acid (PUFA) content was found to be 32.78%, with 9.12% docosahexaenoic acid (DHA) and 10.36% eicosapentaenoic acid (EPA), while the monounsaturated fatty acid (MUFA) content was 30.59% in the hydrolyzed fish oil versus 20.06% in the refined oil. It can thus be concluded that the hydrolysis process applied to oils from fish waste was satisfactory in terms of the absolute lipid yield of the process and the significant preservation of the EPA and DHA percentages, which is of interest for the production of nutraceuticals and for the nutrition of aquatic animals, including shrimp in captivity.