59 results for Complex products
Abstract:
Non-timber forest products (NTFPs) are one of the major income sources for the rural population of Laos. An exploratory study was conducted to determine the role of non-timber forest products for rural communities of the study area. The study was carried out in two villages, Ban Napo and Ban Kouay, of Sangthong district between January and March 2010. A semi-structured questionnaire was used to gather data from the respondents. Twenty-five respondents from each village were chosen based on their involvement in NTFP collection and marketing activities. No statistically significant differences in NTFP income were found between the villages or between age groups of the respondents; however, significant differences in annual income were found between farm sizes. This study also analyzed the value chain structure of three important non-timber forest products (See khai' ton, bamboo mats, and incense sticks) and the interactions between the actors in the case study areas. Barriers to entering the market, governance, and upgrading possibilities are discussed for each of the value chains. Comparison of unit prices at different levels of the value chains indicated an uneven income distribution in favour of the intermediaries, factories, and foreign buyers. A lack of capital, marketing information, and negotiation skills prevented the villagers from increasing their incomes. Nevertheless, all respondents expressed satisfaction with their income from NTFPs.
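As a hedged illustration of the income comparison reported above: the abstract does not name the statistical test used, but with small samples (25 respondents per village) a non-parametric Kruskal-Wallis test is one plausible choice for comparing annual income across farm-size groups. All numbers and group labels below are invented.

```python
# Minimal sketch: comparing annual NTFP income across farm-size groups.
# The test choice (Kruskal-Wallis) is an assumption; the abstract does
# not name the method. All income figures are invented.
from scipy.stats import kruskal

# Hypothetical annual NTFP incomes (USD) grouped by farm size.
small = [310, 280, 450, 390, 270, 330, 410, 300]
medium = [520, 480, 610, 450, 570, 500, 540, 460]
large = [700, 650, 820, 760, 690, 710, 680, 730]

stat, p = kruskal(small, medium, large)
print(f"H = {stat:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Income differs significantly across farm-size groups.")
```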
Abstract:
A better understanding of stock price changes is important in guiding many economic activities. Since prices seldom change without good reason, the search for related explanatory variables has attracted many researchers. This book seeks answers from prices per se by relating price changes to their conditional moments. This approach rests on the belief that prices are the product of a complex psychological and economic process, and that their conditional moments derive ultimately from the underlying psychological and economic shocks. Utilizing information about conditional moments is hence an attractive alternative to using other, selectively chosen financial variables to explain price changes. The first paper examines the relation between the conditional mean and the conditional variance using information about moments in three types of conditional distributions; it finds that the significance of the estimated mean-variance ratio can be affected by the assumed distributions and by time variation in skewness. The second paper decomposes conditional industry volatility into a concurrent market component and an industry-specific component; it finds that market volatility is on average responsible for a rather small share of total industry volatility: 6 to 9 percent in the UK and 2 to 3 percent in Germany. The third paper looks at the heteroskedasticity in stock returns through an ARCH process supplemented with a set of conditioning information variables; it finds that stock returns exhibit several forms of heteroskedasticity, including deterministic changes in variances due to seasonal factors, random adjustments in variances due to market and macro factors, and ARCH processes driven by past information. The fourth paper examines the role of higher moments, especially skewness and kurtosis, in determining expected returns; it finds that total skewness and total kurtosis are more relevant non-beta risk measures, and that they are costly to diversify away, either because diversification may eliminate their desirable components or because diversification strategies based on them are unsustainable.
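The book's central idea, that price changes relate to time-varying conditional moments, can be illustrated with a minimal sketch. The following example computes rolling estimates of the first four moments of a simulated return series; it is not the book's actual estimation procedure (which involves assumed conditional distributions and ARCH processes), and the window length and data are illustrative assumptions.

```python
# Minimal sketch: rolling estimates of the conditional moments of returns.
# Only illustrates that mean, variance, skewness, and kurtosis can all
# vary over time; data are simulated fat-tailed returns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
returns = pd.Series(rng.standard_t(df=5, size=1000) * 0.01)

window = 60  # trading days; an arbitrary choice
moments = pd.DataFrame({
    "mean": returns.rolling(window).mean(),
    "variance": returns.rolling(window).var(),
    "skewness": returns.rolling(window).skew(),
    "kurtosis": returns.rolling(window).kurt(),  # excess kurtosis
})
print(moments.dropna().tail())
```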
Abstract:
The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols have been increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols remains a highly uncertain science. A better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the chemical composition of aerosols can be determined by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques harbor drawbacks. In this study, novel methodologies were developed for sampling and determining the chemical composition of atmospheric aerosols. In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted into and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, the PILS was modified and the sampling procedure optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds upstream of the PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS). The laboriousness of LLE followed by GC-MS analysis next prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples; these two compound groups are thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-phase anion exchange (MAX) were tested as extraction materials. MAX proved efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for the organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here the on-line coupling of PILS with HPLC-MS through the SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples. A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and an understanding of the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and aerosols were collected on Teflon or quartz-fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA for alternating 15-minute periods. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and must always be taken into account.
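As a hedged illustration of the zero gas-phase correction described above, the sketch below subtracts a DMA-off blank from a DMA-on filter mass. The numbers are invented; the real analysis involves compound-specific chromatographic quantification.

```python
# Minimal sketch: correcting filter masses for gas-phase adsorption using
# the "zero gas-phase" samples (DMA switched off). Values are invented;
# the thesis reports the gas-phase contribution can be large for
# nanometer-sized particles.
total_filter_ng = 120.0   # compound mass with DMA on (particles + gas)
blank_filter_ng = 45.0    # compound mass with DMA off (gas phase only)

particle_ng = total_filter_ng - blank_filter_ng
gas_fraction = blank_filter_ng / total_filter_ng
print(f"Particle-phase mass: {particle_ng:.1f} ng "
      f"(gas phase contributed {gas_fraction:.0%})")
```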
A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in boreal forests) in aerosol particles. Since reference compounds are needed to verify the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.
Abstract:
Campylobacter, mainly Campylobacter jejuni and C. coli, are recognized worldwide as a major cause of bacterial food-borne gastroenteritis (World Health Organization 2010). Epidemiological studies have shown the handling or eating of poultry to be a significant risk factor for human infections. Campylobacter contamination can occur at all stages of the poultry meat production cycle. In summer 1999, every broiler flock from all three major Finnish poultry slaughterhouses was studied during a five-month period. Caecal samples were taken in the slaughterhouses from five birds per flock. A total of 1 132 broiler flocks were tested, and 33 (2.9%) of those were Campylobacter-positive. Thirty-one isolates were identified as C. jejuni and two as C. coli. The isolates were serotyped for heat-stable antigens (HS) and genotyped by pulsed-field gel electrophoresis (PFGE). The most common serotypes found were HS 6,7, HS 12, and the HS 4-complex. Using a combination of SmaI and KpnI patterns, 18 different PFGE types were identified. Thirty-five Finnish C. jejuni strains with five SmaI/SacII PFGE types, selected among human and chicken isolates from 1997 and 1998, were used to compare their PFGE patterns, amplified fragment length polymorphism (AFLP) patterns, HaeIII ribotypes, and HS serotypes. The discriminatory power of PFGE, AFLP, and ribotyping with HaeIII was shown to be at the same level for this selected set of strains, and these methods assigned the strains to the same groups. The PFGE and AFLP patterns within a genotype were highly similar, indicating genetic relatedness. A single HS serotype could be distributed among different genotypes, and different serotypes were identified within one genotype. A total of 456 samples were collected over one and a half years from one turkey parent flock, the hatchery, six different commercial turkey farms (together 12 flocks), and 11 stages at the slaughterhouse. Both conventional culture and a PCR method were used for the detection of Campylobacter. No Campylobacter were detected by the culture method in the samples from the turkey parent flock or the hatchery. PCR, however, detected Campylobacter DNA in five faecal samples from the turkey parent flock and in one fluff sample and one eggshell sample. Six out of 12 commercial turkey flocks were found to be negative at the farm level, but only two of those were negative at slaughter. The proportion of Campylobacter-positive samples within a flock at slaughter ranged from 0% to 94%, with evisceration and chilling water being the most critical stages for contamination. All 121 Campylobacter isolates were shown to be C. jejuni using a multiplex PCR assay. PFGE analysis of all isolates with the KpnI restriction enzyme resulted in 11 PFGE types (I-XI), and flaA-SVR typing yielded nine flaA-SVR alleles. Three Campylobacter-positive turkey flocks were colonized by a limited number of Campylobacter genotypes both at the farm and at slaughter. In conclusion, our first study in 1999 detected a low prevalence of Campylobacter in Finnish broiler flocks, and prevalence has remained at a low level during the study period until the present. In turkey meat production, we found that flocks which were negative at the farm became contaminated with Campylobacter during the slaughter process. These results suggest that proper and efficient cleaning and disinfection of slaughter and processing premises are needed to avoid cross-contamination.
Prevention of colonization at the farm through a high level of biosecurity control and hygiene may be one of the most efficient ways to reduce the amount of Campylobacter-positive poultry meat in Finland. In Finland, with a persistently low level of Campylobacter-positive flocks, it could be speculated that logistic slaughtering, scheduled according to Campylobacter status at the farm, might be advantageous in reducing Campylobacter contamination of retail poultry products. However, the significance of domestic poultry meat for human campylobacteriosis in Finland remains to be evaluated.
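The comparison of typing methods in the abstract above (PFGE, AFLP, HaeIII ribotyping) rests on their discriminatory power. The abstract does not name an index, but discriminatory power is conventionally quantified with the Hunter-Gaston discriminatory index; the sketch below computes it for hypothetical type assignments.

```python
# Minimal sketch: comparing the discriminatory power of typing methods
# with the Hunter-Gaston discriminatory index (the probability that two
# randomly drawn strains have different types). The index choice is an
# assumption; the abstract only states the methods performed equally.
from collections import Counter

def hunter_gaston(type_assignments):
    """Hunter-Gaston index: D = 1 - sum(n_j (n_j - 1)) / (N (N - 1))."""
    n = len(type_assignments)
    counts = Counter(type_assignments).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical type assignments for the same 10 strains by two methods.
pfge = ["I", "I", "II", "III", "III", "III", "IV", "V", "V", "VI"]
aflp = ["a", "a", "b", "c", "c", "c", "d", "e", "e", "f"]
print(f"PFGE D = {hunter_gaston(pfge):.3f}, AFLP D = {hunter_gaston(aflp):.3f}")
```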
Abstract:
In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical; that is, they can take only discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and of snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was unknown, and we were forced simply to measure the agreement between different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometer observations from the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods. In this vein, we tentatively discuss how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that such data are of reasonable quality, and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored: the Autoclass algorithm was used to construct compact representations of the synoptic conditions of fog at Finnish airports.
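As a minimal sketch of the bootstrap procedure, the following example constructs a percentile confidence interval for the agreement rate between a hypothetical satellite cloud mask and ceilometer observations. It uses simple i.i.d. resampling on simulated data; the thesis itself had to account for spatial and temporal correlation (e.g., with block resampling), so this shows only the basic idea.

```python
# Minimal sketch: percentile bootstrap CI for the agreement rate between
# two categorical products (satellite cloud mask vs. ceilometer). Data
# are simulated; real data are correlated in space and time, which calls
# for block resampling rather than this i.i.d. version.
import numpy as np

rng = np.random.default_rng(42)
n = 500
ceilometer = rng.integers(0, 2, size=n)                  # 1 = cloud, 0 = clear
satellite = np.where(rng.random(n) < 0.85, ceilometer,   # ~85% agreement
                     1 - ceilometer)

agreement = (satellite == ceilometer).astype(float)
boot = np.array([
    rng.choice(agreement, size=n, replace=True).mean()
    for _ in range(2000)
])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"Agreement = {agreement.mean():.3f}, "
      f"95% CI = [{ci_low:.3f}, {ci_high:.3f}]")
```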
Abstract:
People with coeliac disease have to maintain a gluten-free diet, which means excluding wheat, barley, and rye prolamin proteins. Immunochemical methods are used to analyse the harmful proteins and to control the purity of gluten-free foods. In this thesis, the behaviour of prolamins in immunological gluten assays and with different prolamin-specific antibodies was examined. The immunoassays were also used to detect residual rye prolamins in sourdough systems after enzymatic hydrolysis and wheat prolamins after deamidation. The aim was to characterize the ability of the gluten analysis assays to quantify different prolamins in varying matrices in order to improve the accuracy of the assays. The prolamin groups of cereals consist of a complex mixture of proteins that vary in size and amino acid sequence. Two common characteristics distinguish prolamins from other cereal proteins: firstly, they are soluble in aqueous alcohols, and secondly, most prolamins are formed mainly from repetitive amino acid sequences containing high amounts of proline and glutamine. The diversity among prolamin proteins sets high requirements for their quantification. In the present study, prolamin contents were evaluated using enzyme-linked immunosorbent assays based on ω- and R5 antibodies. In addition, assays based on A1 and G12 antibodies were used to examine the effect of deamidation on prolamin proteins. The prolamin compositions and the cross-reactivity of the antibodies with prolamin groups were evaluated with electrophoretic separation and Western blotting. The results of this thesis research demonstrate that the currently used gluten analysis methods are not able to accurately quantify barley prolamins, especially when hydrolysed or mixed in oats. However, more precise results can be obtained when the standard more closely matches the sample proteins, as demonstrated with barley prolamin standards. The study also revealed that all of the harmful prolamins, i.e. wheat, barley, and rye prolamins, are most efficiently extracted with 40% 1-propanol containing 1% dithiothreitol at 50 °C. The extractability of barley and rye prolamins was considerably higher with 40% 1-propanol than with 60% ethanol, which is typically used for prolamin extraction. The prolamin levels of rye were lowered by 99.5% from the original levels when an enzyme-active rye-malt sourdough system was used for prolamin degradation. Such extensive degradation of rye prolamins suggests that sourdough could be used as part of gluten-free baking. Deamidation increases the diversity of prolamins and improves their solubility and their ability to form structures such as emulsions and foams. Deamidation changes the protein structure, which has consequences for antibody recognition in gluten analysis. According to the results of the present work, the analysis methods were not able to quantify wheat gluten after deamidation except at very high concentrations. Consequently, deamidated gluten peptides can exist in food products and remain undetected, and thus pose a risk for people with gluten intolerance. The results of this thesis demonstrate that current gluten analysis methods cannot accurately quantify prolamins in all food matrices. This thesis also provides new information on the prolamins of rye and barley in addition to wheat prolamins, which is essential for improving gluten analysis methods so that they can more accurately quantify prolamins from harmful cereals.
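The ELISA-based quantification discussed above implicitly relies on a calibration curve that converts assay readouts into prolamin concentrations. The thesis does not detail the curve fitting; the sketch below assumes the standard four-parameter logistic (4PL) model and invented calibration standards, fits the curve, and back-calculates a sample concentration.

```python
# Minimal sketch: converting ELISA absorbances to prolamin concentrations
# via a four-parameter logistic (4PL) calibration curve. The 4PL model
# is an assumption (conventional for immunoassays); standards and
# readings are invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL: a = response at zero dose, d = at infinite dose,
    c = midpoint concentration, b = slope."""
    return d + (a - d) / (1 + (x / c) ** b)

conc = np.array([5, 10, 20, 40, 80])             # standards, ng/mL
absorb = np.array([0.12, 0.25, 0.48, 0.83, 1.20])  # measured absorbances

params, _ = curve_fit(four_pl, conc, absorb,
                      p0=[0.05, 1.0, 30.0, 1.5], maxfev=10000)

def invert_4pl(y, a, b, c, d):
    """Back-calculate concentration from a measured absorbance."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

print(f"Sample at A = 0.60 -> {invert_4pl(0.60, *params):.1f} ng/mL")
```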
Abstract:
A thunderstorm is a dangerous electrical phenomenon in the atmosphere. A thundercloud forms when thermal energy is transported rapidly upwards in convective updraughts. Electrification occurs through collisions of cloud particles in the strong updraught. When the amount of charge in the cloud is large enough, electrical breakdown, better known as a flash, occurs. Lightning location is nowadays an essential tool for the detection of severe weather. Located flashes indicate in real time the movement of hazardous areas and the intensity of lightning activity. An estimate of the flash peak current can also be determined, and the observations can be used in damage surveys. The simplest way to present lightning data is to plot the locations on a map, but the data can also be processed into more complex end-products and exploited in data fusion. Lightning data also serve as an important tool in research on lightning-related phenomena, such as Transient Luminous Events. Most thunderstorms globally occur in areas with plenty of heat, moisture, and tropospheric instability, for example over tropical land areas. At higher latitudes, as in Finland, thunderstorms are practically restricted to the summer season. A particular feature of high-latitude climatology is the large annual variation, which applies to thunderstorms as well. Knowing the performance of any measuring device is important because it affects the accuracy of the end-products. In lightning location systems, the detection efficiency is the ratio between located flashes and those that actually occurred. Because in practice it is impossible to know the true number of flashes, the detection efficiency has to be estimated with theoretical methods.
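Because the true number of flashes is unknowable, detection efficiency must be estimated theoretically. One simple way to see how such an estimate can work is a Monte Carlo sketch: draw flash peak currents from an assumed distribution and count the fraction exceeding a distance-dependent sensor threshold. The lognormal parameters and threshold model below are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch: Monte Carlo estimate of lightning detection efficiency.
# A flash is "detected" if its peak current exceeds a distance-dependent
# sensor threshold. The current distribution and threshold model are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_flashes = 100_000

peak_current_ka = rng.lognormal(mean=np.log(15), sigma=0.7, size=n_flashes)
distance_km = rng.uniform(10, 300, size=n_flashes)  # flash-to-sensor distance

# Assumed threshold: minimum detectable current grows with distance.
threshold_ka = 2.0 + 0.04 * distance_km

detection_efficiency = np.mean(peak_current_ka > threshold_ka)
print(f"Estimated flash detection efficiency: {detection_efficiency:.1%}")
```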
Abstract:
The purpose of this master's thesis is to analyze how NATO Secretary General Anders Fogh Rasmussen seeks to justify the existence of the military alliance through the use of security arguments. I am puzzled by the question of why NATO still exists: what is NATO's raison d'être? The New Strategic Concept (2010) forms the basis of his argumentation. This thesis focuses on the security argumentation of NATO, which is examined by analyzing the speeches of the Secretary General. The theoretical framework of this study is based on a constructivist approach to international security, examining the linguistic process of securitization. Issues become securitized once Anders Fogh Rasmussen names them as threats. This thesis focuses on the securitization process relating to NATO and analyses which issues Rasmussen raises to the security agenda. The research data consist of speeches by Anders Fogh Rasmussen, analyzed through J.L. Austin's speech act taxonomy and Chaïm Perelman's argumentation theories. The thesis concentrates on the formulation and articulation of threats that are framed as "new threats" in contemporary international relations, and the research is conducted through the lens of securitization theory. This study illustrates that the threats are constructed by NATO's member states in unison, but the resolutions are voiced through Rasmussen's official speeches and transcripts. Based on the analysis, it can be concluded that Rasmussen gives reasons for the existence of NATO by making use of speech acts and different rhetorical techniques. The results of the analysis indicate that, according to the Secretary General, NATO remains an essential organization for the West and the rest of the world.
Abstract:
The removal of noncoding sequences, or introns, from eukaryotic messenger RNA precursors is catalyzed by a ribonucleoprotein complex known as the spliceosome. In most eukaryotes, two distinct classes of introns exist, each removed by a specific type of spliceosome. The major, U2-type introns account for over 99% of all introns and are almost ubiquitous. The minor, U12-type introns are found in most but not all eukaryotes, and reside in conserved locations in a specific set of genes. Due to their slow excision rates, the U12-type introns are thought to be involved in the regulation of the genes containing them by inhibiting the maturation of the messenger RNAs. However, little information is currently available on how the activity of the U12-dependent spliceosome itself is regulated. The levels of many known splicing factors are regulated through unproductive alternative splicing events, which lead to the inclusion of premature stop codons, targeting the transcripts for destruction by the nonsense-mediated decay pathway. These alternative splice sites are typically found in highly conserved sequence elements, which also contain binding sites for factors regulating the activation of the splice sites. Often, activation is achieved by the binding of products of the gene in question, resulting in negative feedback loops. In this study, I show that U11-48K, a protein factor specific to the minor spliceosome, specifically recognizes the U12-type 5' splice site sequence and is essential for the proper function of the minor spliceosome. Furthermore, the expression of U11-48K is regulated through a feedback mechanism, which functions through conserved sequence elements that activate alternative splicing and nonsense-mediated decay. This mechanism is conserved from plants to animals, highlighting both the importance and the early origin of this mechanism in regulating splicing factors. I also show that the feedback regulation of U11-48K is counteracted by a component of the major spliceosome, the U1 small nuclear ribonucleoprotein particle, as well as by members of the hnRNP F/H protein family. These results suggest that the feedback mechanism is finely tuned by multiple factors to achieve precise control of the activity of the U12-dependent spliceosome.
Abstract:
The purpose of this study was to find out whether food-related lifestyle guides and explains product evaluations, specifically consumer perceptions and choice evaluations of five different food product categories: lettuce, mincemeat, savoury sauce, goat cheese, and pudding. The opinions of consumers who shop in neighbourhood stores were considered most valuable. This study applies means-end chain (MEC) theory, according to which products are seen as means by which consumers attain meaningful goals. The food-related lifestyle (FRL) instrument was created to study lifestyles that reflect these goals. Further, this research adopts the view that the FRL functions as a script which guides consumer behaviour. Two research methods were used in this study. The first was the laddering interview, the primary aim of which was to gather information for formulating the questionnaire of the main study. The survey consisted of two separate questionnaires: the first was the FRL questionnaire modified for this study, and the aim of the other was to determine the choice criteria for buying the five categories of food products. Before these analyses could be made, several data modifications were carried out following MEC analysis procedures. Besides forming FRL dimensions by computing sum scores from the FRL statements, factor analysis was run in order to elicit latent factors underlying the dimensions. The lifestyle factors found were adventurous, conscientious, enthusiastic, snacking, moderate, and uninvolved lifestyles. The association analyses were done separately for each product choice as well as for each attribute-consequence linkage with the non-parametric Mann-Whitney U test, as sketched below; the testing variables were the FRL dimensions and the FRL lifestyle factors. In addition, the relations between the attribute-consequence linkages and the demographic variables were analysed. Results from this study showed that product choice is sequential: consumers first categorize products into groups based on specific criteria such as health or convenience. The results confirmed that food-related lifestyles function as a script in food choice and that the FRL instrument can be used to predict consumer buying behaviour. Certain lifestyles were associated with the choice of each product category; the actual product choice within a product category then appeared to be a different matter. In addition, this study proposes a modification to the FRL instrument. The "positive towards advertising" FRL dimension was modified to examine many kinds of information search, including the internet, TV, magazines, and other people. This new dimension, designated "open to additional information", proved to be very robust and reliable in revealing differences in consumer choice behaviour. Active additional information search was linked to the adventurous and snacking food-related lifestyles. The results of this study support previous findings that consumers expect to obtain many benefits simultaneously when they buy food products. This study provided detailed information about the benefits sought, with the combination of benefits differing between products and between respondents. Household economy, pleasure, and quality were emphasized in the choice of lettuce. Quality was the most significant benefit in choosing mincemeat, but health-related benefits were often evaluated as well.
The dominant benefits linked to savoury sauce were household economic benefits, expected pleasurable experiences, and a lift in self-respect. The choice of goat cheese appeared not to be an economic decision; self-respect, pleasure, and quality were included in the choice criteria. In choosing pudding, the respondents considered the well-being of family members, and indulged their family members or themselves.
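A minimal sketch of the Mann-Whitney U comparison named in the abstract: the example tests whether a hypothetical attribute-consequence linkage score differs between respondents scoring high and low on one lifestyle factor. All scores and the factor split are invented.

```python
# Minimal sketch: Mann-Whitney U test of an attribute-consequence linkage
# (e.g., "quality -> pleasure" for lettuce) between consumers high vs.
# low on the "adventurous" lifestyle factor. Scores are invented.
from scipy.stats import mannwhitneyu

adventurous = [5, 7, 6, 8, 7, 9, 6, 7, 8, 6]
non_adventurous = [3, 4, 2, 5, 4, 3, 5, 4, 3, 4]

u_stat, p = mannwhitneyu(adventurous, non_adventurous,
                         alternative="two-sided")
print(f"U = {u_stat}, p = {p:.4f}")
```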