13 results for Critical chain method

in Helda - Digital Repository of the University of Helsinki


Relevance:

80.00%

Publisher:

Abstract:

The thesis consists of five international congress papers and a summary with an introduction. The overarching aim of the studies and the summary is to examine the inner coherency of the theological and anthropological thinking of Gregory of Nyssa (331-395). An "apophatic approach" with a "Christological focus" is applied to the issue. It is suggested that the coherency is to be found in the Christological concept of unity between "true God" and "true man" in the one person of Jesus Christ. Gregory is among the first to make a full recognition of the two natures of Christ, and to use this recognition systematically in his writings. The aim of the studies is pursued by the method of "identification", a combination of the modern critical "problematic method" and Gregory's own aphairetic method of "following" (akolouthia). The preoccupation with issues relating to the so-called Hellenization of Christianity in the patristic era was strong in twentieth-century Gregory scholarship. The most discussed questions have been the Greek influence in his thought and his philosophical sources. The five articles of the thesis examine how Gregory's thinking stands in its own right. The manifestly apophatic character of his theological thinking is made a part of the method of examining his thought according to the principles of his own method of following. The basic issue concerning the relation of theology and anthropology is discussed in the contexts of his central Trinitarian, anthropological, Christological and eschatological sources. In the summary the Christocentric integration of Gregory's thinking is also discussed in relation to the issue of the alleged Hellenization. The main conclusion of the thesis concerns the concept of theology in Gregory. It is not indebted to the classical concept of theology as metaphysics or human speculation about God. Instead, it is founded on the traditional Judeo-Christian idea of God who speaks with his people face to face. In Gregory, theologia connotes the oikonomia of God's self-revelation. It may be regarded as the state of constant expression of love between the Creator and his created image. In theology, the human person becomes an image of the Word by which the Father expresses his love for "man", whom he loves as his own Son. Eventually the whole of humankind, as one, gives the divine Word a physical, audible and sensible Body. Humankind then becomes what theology is. The whole of humanity expresses divine love by manifesting Christ in words and deeds, singing in one voice to the glory of the Father, the Son and the Holy Spirit.

Relevance:

40.00%

Publisher:

Abstract:

The safety of food has become an increasingly interesting issue to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is an attribute that consumers find very difficult to assess. The literature in this study consists of three main themes: traceability; consumer behaviour related to quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors affect consumers' willingness to pay. If the respondents considered genetic modification of food or foodborne zoonotic diseases to be harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. The results also showed that safety-related quality cues are significant to consumers. Above all, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Similarly, other process-control-related information ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
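As an illustration of how contingent valuation responses of this kind are commonly modelled, the sketch below fits a binary logit of bid acceptance on the proposed price premium and two risk-perception indicators. The data, variable names and coefficients are hypothetical, and the thesis's actual model specification may well differ.

```python
# Minimal sketch of a contingent-valuation style logit model (hypothetical
# data and variable names; the thesis's actual specification may differ).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "bid_increase_pct": rng.choice([10, 24, 40], size=n),   # proposed price premium
    "risk_gm": rng.integers(0, 2, size=n),                  # sees GM food as harmful
    "risk_zoonosis": rng.integers(0, 2, size=n),            # sees zoonoses as harmful
})
# Simulated acceptance: higher bids lower acceptance, risk perceptions raise it
logit_p = 1.5 - 0.05 * df["bid_increase_pct"] + 0.8 * df["risk_gm"] + 0.6 * df["risk_zoonosis"]
df["accepts"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(df[["bid_increase_pct", "risk_gm", "risk_zoonosis"]])
model = sm.Logit(df["accepts"].astype(int), X).fit(disp=False)
print(model.summary())
# Under this linear-in-bid specification, mean WTP can be read off as
# -(intercept + other terms at their means) / bid coefficient.
```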

Relevance:

30.00%

Publisher:

Abstract:

Fluid bed granulation is a key pharmaceutical process which improves many of the powder properties required for tablet compression. The fluid bed granulation process includes dry mixing, wetting and drying phases. Granules of high quality can be obtained by understanding and controlling the critical process parameters through timely measurements. Process analytical technologies (PAT) cover the integrated analysis of physical process measurements and particle size data from a fluid bed granulator. Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation. Moisture influences many process, granule and tablet qualities. The approach in this thesis was to identify sources of variation that are mainly related to moisture. The aim was to determine correlations and relationships, and to utilise the PAT and design space concepts for fluid bed granulation and drying. Monitoring the material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of a process and the end product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effect of inlet air humidity and granulation liquid feed on the temperature measurements at different locations of the fluid bed granulator system was determined. This revealed dynamic changes in the measurements and enabled finding the optimal sites for process control. The moisture originating from the granulation liquid and inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimize particle size. Various end-point indication techniques for drying were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and resulted in the most precise estimation of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined. The feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter that describes the behaviour of material in a fluid bed was developed. The flow rate of the process air and the turbine fan speed were used to calculate this parameter, and it was compared to the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation, based on the fluidisation parameters, were determined. With this design space it is possible to avoid both excessive and improper fluidisation as well as bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser. Both rapid increases and decreases in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had formed. The various physical parameters evaluated in this thesis give valuable information on fluid bed process performance and increase process understanding.
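As a rough illustration of a ∆T-style end-point criterion (the abstract does not spell out its exact formulation), one can monitor the difference between inlet-air and bed temperatures: while free moisture evaporates the bed stays cooler than the inlet air, and the difference settling to a stable plateau signals that drying is complete. The threshold, window and signal names in this sketch are assumptions, not the thesis's algorithm.

```python
# Hedged sketch of a Delta-T style drying end-point check (hypothetical
# threshold, window and signal names; not the thesis's exact algorithm).
from collections import deque

def drying_endpoint_reached(t_inlet_C, t_bed_C, history, window=30, slope_limit=0.02):
    """Return True when the inlet/bed temperature difference has stopped changing.

    history: deque of recent delta-T values, one per sampling interval.
    """
    delta_t = t_inlet_C - t_bed_C
    history.append(delta_t)
    if len(history) < window:
        return False
    recent = list(history)[-window:]
    # Approximate slope of delta-T over the window (degrees per sample)
    slope = (recent[-1] - recent[0]) / (window - 1)
    return abs(slope) < slope_limit

history = deque(maxlen=120)
# In a real system these values would stream from the granulator's sensors.
for t_in, t_bed in [(60.0, 32.0), (60.0, 33.5), (60.0, 35.0), (60.0, 35.1), (60.0, 35.1)]:
    print(drying_endpoint_reached(t_in, t_bed, history, window=3, slope_limit=0.1))
```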

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the present study was to investigate the possibilities and interconnections that exist concerning the relationship between the University of Applied Sciences and the Learning by Developing action model (LbD), on the one hand, and education for sustainable development and high-quality learning as a part of professional competence development on the other. The research and learning environment was the Coping at Home research project and its Caring TV project, which provided the context of the Physiotherapy for Elderly People professional study unit. The researcher was a teacher and an evaluator of her own students' learning. The aims of the study were to monitor and evaluate learning at the individual and group level using tools of high-quality learning, improved concept maps, related to understanding the project's core concept of successful ageing. Conceptions were evaluated through aspects of sustainable development and a conceptual basis of physiotherapy. As educational research, this was a multi-method case-study design experiment. The three research questions were as follows. 1. What kind of individual conceptions and conceptual structures do students build concerning the concept of successful ageing? How many and what kind of concepts and propositions do they have a) before the study unit, b) after the study unit, c) after the social knowledge building? 2. What kind of social knowledge building exists? a) What kind of social learning process exists? b) What kind of socially created concepts, propositions and conceptual structures do the students possess after the project? c) What kind of meaning does the social knowledge building have at an individual level? 3. How do physiotherapy competences develop according to the results of the first and second research questions? The subjects were 22 female, third-year Bachelor of Physiotherapy students at Laurea University of Applied Sciences in Finland. Individual learning was evaluated in 12 of the 22 students. The data was collected as a part of the learning exercises of the Physiotherapy for Elderly People study unit, with improved concept maps at both individual and group levels. The students were divided into two social knowledge-building groups: the first group had 15 members and the second 7 members. Each group created a group-level concept map on the theme of successful ageing. These face-to-face interactions were recorded with CMapTools and videotaped. The data consists of the individually produced concept maps, the group-produced concept maps of the two groups, and the videotaped material of these processes. The data analysis was carried out at the intersection of various research traditions. Individually produced data was analysed based on content analysis. Group-produced data was analysed based on content analysis and dialogue analysis. The data was also analysed by simple statistical analysis. In the individually produced improved concept maps the students' conceptions were comprehensive, and the first concept maps were found to have many concepts unrelated to each other. The conceptual structures were between spoke structures and chain structures. Only a few professional concepts were evident. In the second individual improved concept maps the conception was more professional than earlier, particularly from the functional point of view. The conceptual structures mostly resembled spoke structures. After the second individual concept mapping, social mapping interventions were made in the two groups. After this, multidisciplinary concrete links were established between all concepts in almost all individual concept maps, and the interconnectedness of the concepts in different subject areas was thus understood. The conceptual structures were mainly net structures. The concepts in these individual concept maps were also found to be more professional and concrete than in the previous concept maps of these subjects. In addition, the wider context dependency of the concepts was recognized in many individual concept maps. This implies a conceptual framework for specialists. The social knowledge building was similar to a social learning process. Both socio-cultural processes and cognitive processes were found to develop students' conceptual awareness and the ability to engage in intentional learning. In the knowledge-building process two aspects were found: knowledge creation and pedagogical action. The discussion during the concept-mapping process was similar to a shared thinking process. In visualising the process with CMapTools, students easily complemented each other's thoughts and words, as if mutually telepathic. Synthesizing, supporting, asking and answering, peer teaching and counselling, tutoring, evaluating and arguing took place, and students were very active, self-directed and creative. It took hundreds of conversations before a common understanding could be found. The use of concept mapping in particular was very effective. The concepts in these group-produced concept maps were found to be professional, and values of sustainable development were observed. The results show the importance of developing the contents and objectives of the European Qualification Framework as well as education for sustainable development, especially in terms of the need for knowledge creation, global responsibility and systemic, holistic and critical thinking in order to develop clinical practice. Keywords: education for sustainable development, learning, knowledge building, improved concept map, conceptual structure, competence, successful ageing
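The spoke, chain and net structures mentioned above can also be characterised mechanically from a concept map's link structure. The sketch below uses simple, hypothetical degree-based heuristics to label a map; the thesis itself relied on qualitative content analysis, so both the rules and the example maps are purely illustrative.

```python
# Minimal sketch for labelling a concept map as spoke, chain or net from its
# links (hypothetical heuristics; the thesis used qualitative content analysis).
from collections import defaultdict

def classify_concept_map(links):
    """links: list of (concept_a, concept_b) propositions."""
    degree = defaultdict(int)
    for a, b in links:
        degree[a] += 1
        degree[b] += 1
    n_nodes, n_links = len(degree), len(links)
    max_deg = max(degree.values())
    if n_links >= n_nodes:          # at least one cycle or cross-link
        return "net"
    if max_deg >= n_nodes - 1:      # one hub connected to everything else
        return "spoke"
    if max_deg <= 2:                # a simple path
        return "chain"
    return "mixed"

spoke = [("ageing", "health"), ("ageing", "mobility"), ("ageing", "nutrition")]
chain = [("ageing", "health"), ("health", "mobility"), ("mobility", "independence")]
net = chain + [("ageing", "independence"), ("health", "nutrition"), ("nutrition", "ageing")]
for name, links in [("spoke", spoke), ("chain", chain), ("net", net)]:
    print(name, "->", classify_concept_map(links))
```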

Relevance:

30.00%

Publisher:

Abstract:

The sustainability of food production has increasingly attracted the attention of consumers, farmers, food and retailing companies, and politicians. One manifestation of such attention is the growing interest in organic foods. Organic agriculture has the potential to enhance the ecological modernisation of food production by implementing the organic method as a preventative innovation that simultaneously produces environmental and economic benefits. However, in addition to the challenges to organic farming, the small market share of organic products in many countries today, and in Finland in particular, risks undermining the achievement of such benefits. The problems identified as hindrances to the increased consumption of organic food are the poor availability, limited variety and high prices of organic products, the complicated buying decisions and the difficulties in delivering the intangible value of organic foods. Small volumes and sporadic markets, high costs, lack of market information, as well as poor supply reliability are obstacles to increasing the volume of organic production and processing. These problems shift the focus from a single actor to the entire supply chain and require solutions that involve more interaction among the actors within the organic chain. As an entity, the organic food chain has received very little scholarly attention. Researchers have mainly approached the organic chain from the perspective of a single actor, or they have described its structure rather than the interaction between the actors. Consequently, interaction among the primary actors in organic chains, i.e. farmers, manufacturers, retailers and consumers, has largely gone unexamined. The purpose of this study is to shed light on the interaction of the primary actors within a whole organic chain in relation to the ecological modernisation of food production. This information is organised into a conceptual framework to help illuminate this complex field. This thesis integrates the theories and concepts of three approaches: food system studies, supply chain management and ecological modernisation. Through a case study, a conceptual system framework is developed and applied to a real-life situation. The thesis is supported by research published in four articles. All examine the same organic chains through case studies, but each approaches the problem from a different, complementary perspective. The findings indicated that, regardless of the coherent values emphasising responsibility, the organic chains were only loosely integrated to operate as a system. The focus was on product flow, leaving other aspects of value creation largely aside. Communication with consumers was rare, and none of the actors had taken a leading role in enhancing the market for organic products. Such a situation presents unsuitable conditions for the ecological modernisation of food production through organic food and calls for contributions from stakeholders other than those directly involved in the product chain. The findings inspired a revision of the original conceptual framework. The revised framework, the 'three-layer framework', distinguishes the different layers of interaction. By gradually enlarging the chain orientation, the different but interrelated layers become visible. A framework is thus provided for further research and for understanding the practical implications of the performance of organic food chains. The revised framework both provides an ideal model for organic chains in relation to ecological modernisation and describes a situation consistent with the empirical evidence.

Relevance:

30.00%

Publisher:

Abstract:

Genetics, the science of heredity and variation in living organisms, has a central role in medicine, in breeding crops and livestock, and in studying fundamental topics of the biological sciences such as evolution and cell functioning. Currently, the field of genetics is developing rapidly because of recent advances in technologies by which molecular data can be obtained from living organisms. In order to extract the most information from such data, the analyses need to be carried out using statistical models that are tailored to take account of the particular genetic processes. In this thesis we formulate and analyze Bayesian models for genetic marker data of contemporary individuals. The major focus is on modeling the unobserved recent ancestry of the sampled individuals (say, for tens of generations or so), which is carried out by using explicit probabilistic reconstructions of the pedigree structures accompanied by the gene flows at the marker loci. For such a recent history, the recombination process is the major genetic force that shapes the genomes of the individuals, and it is included in the model by assuming that the recombination fractions between adjacent markers are known. The posterior distribution of the unobserved history of the individuals is studied conditionally on the observed marker data by using a Markov chain Monte Carlo (MCMC) algorithm. The example analyses consider estimation of the population structure, the relatedness structure (both at the level of whole genomes and at each marker separately), and haplotype configurations. For situations where the pedigree structure is partially known, an algorithm to create an initial state for the MCMC algorithm is given. Furthermore, the thesis includes an extension of the model for the recent genetic history to situations where a quantitative phenotype has also been measured from the contemporary individuals. In that case the goal is to identify positions on the genome that affect the observed phenotypic values. This task is carried out within the Bayesian framework, where the number and the relative effects of the quantitative trait loci are treated as random variables whose posterior distribution is studied conditionally on the observed genetic and phenotypic data. In addition, the thesis contains an extension of a widely used haplotyping method, the PHASE algorithm, to settings where genetic material from several individuals has been pooled together and the allele frequencies of each pool are determined in a single genotyping.
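To illustrate the MCMC machinery referred to above, the following is a bare-bones Metropolis-Hastings sampler for a toy posterior: a single allele frequency given genotype counts under Hardy-Weinberg proportions with a flat prior. It is only a sketch of the sampling technique, not the thesis's pedigree and gene-flow model; the genotype counts are hypothetical.

```python
# Bare-bones Metropolis-Hastings sampler for a toy posterior: allele frequency p
# given genotype counts under Hardy-Weinberg proportions and a flat prior. This
# illustrates the MCMC machinery only, not the thesis's pedigree/gene-flow model.
import math
import random

n_AA, n_Aa, n_aa = 30, 45, 25          # hypothetical genotype counts

def log_posterior(p):
    if not 0.0 < p < 1.0:
        return float("-inf")
    # Multinomial likelihood under HWE (up to a constant), flat prior on p
    return (2 * n_AA + n_Aa) * math.log(p) + (n_Aa + 2 * n_aa) * math.log(1.0 - p)

random.seed(1)
p, samples = 0.5, []
for it in range(20000):
    proposal = p + random.gauss(0.0, 0.05)        # symmetric random-walk proposal
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(p):
        p = proposal
    if it >= 5000:                                 # discard burn-in
        samples.append(p)

print("posterior mean of allele frequency:", sum(samples) / len(samples))
```

With these counts the posterior mean settles near 0.525, the same value the closed-form allele-count estimate would give, which is a quick sanity check on the sampler.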

Relevance:

30.00%

Publisher:

Abstract:

Aerosols impact the planet and our daily lives through various effects, perhaps most notably those related to their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remain a mystery. This thesis consists of studies on new particle formation, specifically from the point of view of numerical modeling. A dependence of the formation rate of 3 nm particles on the sulphuric acid concentration raised to a power of 1-2 has been observed. This suggests that the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words a mechanism based on activation or on kinetic collision of clusters. However, model studies have had difficulties in replicating the small exponents observed in nature. The work done in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the new and more accurate method presented for determining the exponent indicates high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because they cover a large proportion of the Earth's surface area, oceans could prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was applied in a marine scenario. The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate concentrations of condensing organics was developed. The algorithm utilizes a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one from an aerosol dynamics process model. The evaluation indicated excellent agreement with model data, and initial results with field data appear sound as well.
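The observed power-law dependence can be written as J3 ≈ k·[H2SO4]^n with n between 1 and 2. A minimal way to estimate the exponent from co-measured formation rates and acid concentrations is an ordinary least-squares fit in log-log space, sketched below with synthetic data; the thesis's actual, more accurate method for determining the exponent is different.

```python
# Sketch: estimating the exponent n in J3 ~ k * [H2SO4]^n by linear regression
# in log-log space (synthetic data; the thesis presents a more careful method).
import numpy as np

rng = np.random.default_rng(42)
h2so4 = 10 ** rng.uniform(6, 8, size=200)            # molecules cm^-3 (hypothetical range)
true_n, true_k = 1.5, 1e-14
j3 = true_k * h2so4 ** true_n * np.exp(rng.normal(0.0, 0.3, size=200))  # noisy formation rate

# log10 J3 = log10 k + n * log10 [H2SO4]  ->  ordinary least squares on the logs
slope, intercept = np.polyfit(np.log10(h2so4), np.log10(j3), 1)
print(f"estimated exponent n = {slope:.2f}, estimated log10(k) = {intercept:.2f}")
```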

Relevance:

30.00%

Publisher:

Abstract:

Sulfotransferases (SULTs) and UDP-glucuronosyltransferases (UGTs) are important detoxification enzymes, and they contribute to the bioavailability and elimination of many drugs. SULT1A3 is an extrahepatic enzyme responsible for the sulfonation of dopamine, which is often used as its probe substrate. A new method for analyzing dopamine-3-O-sulfate and dopamine-4-O-sulfate by high-performance liquid chromatography was developed, and the enzyme kinetic parameters for their formation were determined using purified recombinant human SULT1A3. The results show that SULT1A3 strongly favors the 3-hydroxy group of dopamine, which indicates that it may be the major enzyme responsible for the difference between the circulating levels of dopamine sulfates in human blood. All 19 known human UGTs were expressed as recombinant enzymes in baculovirus-infected insect cells, and their activities toward dopamine and estradiol were studied. UGT1A10 was identified as the only UGT capable of dopamine glucuronidation at a substantial level. The results were supported by studies with human intestinal and liver microsomes. The affinity was low, indicating that UGT1A10 is not an important enzyme in dopamine metabolism in vivo. Despite the low affinity, dopamine is a potential new probe substrate for UGT1A10 due to its selectivity. Dopamine was used to study the importance of phenylalanines 90 and 93 in UGT1A10. The results revealed distinct effects that depend on differences in the size of the side chain and in its position within the protein. Examination of twelve mutants revealed lower activity in all of them. However, the enzyme kinetic studies of four mutants showed that their affinities were similar to that of UGT1A10, suggesting that F90 and F93 are not directly involved in dopamine binding in the active site. The glucuronidation of β-estradiol and epiestradiol (α-estradiol) was studied to elucidate how the orientation of the 17-OH group affects conjugation at the 3-OH or the 17-OH of either diastereomer. The results show that there are clear differences in the regio- and stereoselectivities of UGTs. The most active isoforms were UGT1A10 and UGT2B7, which demonstrated opposite regioselectivities. The stereoselectivities of UGT2Bs were more complex than those of UGT1As. The amino acid sequences of the human UGTs 1A9 and 1A10 are 93% identical, yet there are large differences in their activity and substrate selectivity. Several mutants were constructed to identify the residues responsible for the activity differences. The results revealed that the residues between Leu86 and Tyr176 of UGT1A9 determine the differences between UGT1A9 and UGT1A10. Phe117 of UGT1A9 participated in 1-naphthol binding, and the residues at positions 152 and 169 contributed to the higher glucuronidation rates of UGT1A10. In summary, the results emphasize that the substrate selectivities, including the regio- and stereoselectivities, of UGTs are complex and are controlled by many amino acids rather than a single critical residue.
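Enzyme kinetic parameters such as those determined for dopamine sulfonation are typically obtained by fitting a Michaelis-Menten model, v = Vmax·[S]/(Km + [S]), to rate-versus-substrate data. The sketch below uses hypothetical numbers; the real kinetics may call for a different model, for example one including substrate inhibition.

```python
# Minimal Michaelis-Menten fit of sulfonation rate vs. dopamine concentration
# (hypothetical data; the actual kinetics may require e.g. a substrate-inhibition model).
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

s = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)     # uM dopamine
v = np.array([0.9, 1.6, 3.1, 4.6, 6.0, 7.4, 8.1, 8.5])         # nmol/min/mg protein

(vmax, km), cov = curve_fit(michaelis_menten, s, v, p0=(10.0, 10.0))
print(f"Vmax = {vmax:.2f} nmol/min/mg, Km = {km:.1f} uM")
```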

Relevance:

30.00%

Publisher:

Abstract:

Critical incidents have had an important role in service quality and service management research. The focus of critical-incident studies has gradually shifted from separate acts and episodes towards relationships, and even towards switching from one relationship to another. The Critical Incident Technique has mainly been used to study the service sector, concentrating on the customer's perception of critical incidents. Although some studies have considered the perceptions of employees important, critical incidents have not, to any great extent, been considered a tool for studying internal relationships. This paper takes a process approach and shifts the focus from an external to an internal setting. It puts forward a new technique for analysing internal relationships from a critical-incident perspective. The technique captures the dynamism in relationships by considering internal critical incidents as micro-processes affecting not only internal but also external relationships.

Relevance:

30.00%

Publisher:

Abstract:

Drug-induced liver injury is one of the most frequent reasons for the removal of drugs from the market. In recent years there has been pressure to develop more cost-efficient, faster and easier ways to investigate drug-induced toxicity in order to recognize hepatotoxic drugs in the earlier phases of drug development. A High Content Screening (HCS) instrument is an automated microscope equipped with image analysis software. It makes image analysis faster and, by always analyzing the images in the same way, decreases the risk of human error. Because the amounts of drug and time needed for the analysis are smaller and multiple parameters can be analyzed from the same cells, the method should be more sensitive, effective and cheaper than the conventional assays in cytotoxicity testing. Liver cells are rich in mitochondria, and many drugs target their toxicity to hepatocyte mitochondria. Mitochondria produce the majority of the ATP in the cell through oxidative phosphorylation. They maintain biochemical homeostasis in the cell and participate in cell death. A mitochondrion is divided into two compartments by the inner and outer mitochondrial membranes. Oxidative phosphorylation takes place in the inner mitochondrial membrane. When released, cytochrome c, a protein of the respiratory chain, activates caspase cascades, which leads to apoptosis. The aim of this study was to implement, optimize and compare mitochondrial toxicity HCS assays in live cells and fixed cells in two cellular models: the human HepG2 hepatoma cell line and rat primary hepatocytes. Three different hepatotoxic and mitochondria-toxic drugs (staurosporine, rotenone and tolcapone) were used. Cells were treated with the drugs and incubated with the fluorescent probes, and the images were then analyzed using a Cellomics ArrayScan VTI reader. Finally, the results obtained with the optimized methods were compared to each other and to the results of the conventional cytotoxicity assays, ATP and LDH measurements. After optimization, the live cell method and rat primary hepatocytes were selected for the experiments. Staurosporine was the most toxic of the three drugs and damaged the cells most quickly. Rotenone was less toxic, but its results were more reproducible and it would thus serve as a good positive control in screening. Tolcapone was the least toxic. So far, the conventional cytotoxicity assays worked better than the HCS methods. More optimization is needed to make the HCS method more sensitive; this was not possible in this study due to time limits.
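One standard way to quantify how well an assay separates treated from untreated controls, and hence to compare the sensitivity of HCS readouts with ATP or LDH measurements, is the Z'-factor of Zhang et al. This metric is not described in the abstract, and the plate data below are hypothetical; the sketch only shows how such a comparison could be set up.

```python
# Hedged sketch: comparing assay quality with the Z'-factor (Zhang et al.),
# computed from positive- and negative-control wells. Not part of the abstract;
# the well intensities below are hypothetical.
import numpy as np

def z_prime(pos_controls, neg_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; values near 1 are best."""
    pos, neg = np.asarray(pos_controls, float), np.asarray(neg_controls, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(7)
# e.g. an HCS intensity readout: drug-treated vs. vehicle-treated control wells
print("HCS readout Z':", round(z_prime(rng.normal(900, 80, 32), rng.normal(300, 60, 32)), 2))
print("ATP assay   Z':", round(z_prime(rng.normal(0.2, 0.05, 32), rng.normal(1.0, 0.08, 32)), 2))
```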

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of revascularization procedures for critical limb ischaemia (CLI) is to preserve the leg and sustain the patient's ambulatory status. Other goals are ischaemic pain relief and the healing of ischaemic ulcers. Patients with CLI are usually old and have several comorbidities affecting the outcome. Revascularization for CLI is meaningless unless both life and limb are preserved. Therefore, knowledge of both patient- and bypass-related risk factors is of paramount importance in clinical decision-making, patient selection and resource allocation. The aim of this study was to identify patient- and graft-related predictors of impaired outcome after infrainguinal bypass for CLI. The purpose was to assess the outcome of high-risk patients undergoing infrainguinal bypass and to evaluate the usefulness of specific risk scoring methods. The results of bypasses in the absence of optimal vein graft material were also evaluated, and the feasibility of the new method of scaffolding suboptimal vein grafts was assessed. The results of this study showed that renal insufficiency, not only renal failure but also moderate impairment in renal function, seems to be a significant risk factor for both limb loss and death after infrainguinal bypass in patients with CLI. A low estimated GFR (below 30 ml/min/1.73 m²) is a strong independent marker of poor prognosis. Furthermore, estimated GFR is a more accurate predictor of survival and leg salvage after infrainguinal bypass in CLI patients than serum creatinine level alone. We also found that the life expectancy of octogenarians with CLI is short. In this patient group, endovascular revascularization is associated with a better outcome than bypass in terms of survival, leg salvage and amputation-free survival, especially in the presence of coronary artery disease. This study was the first to demonstrate that the Finnvasc and modified Prevent III risk scoring methods both predict the long-term outcome of patients undergoing surgical or endovascular infrainguinal revascularization for CLI. Both risk scoring methods are easy to use and might be helpful in clinical practice as an aid in preoperative patient selection and decision-making. As in previous studies, we found that a single-segment great saphenous vein graft is superior to any other autologous vein graft in terms of mid-term patency and leg salvage. However, if an optimal vein graft is lacking, arm vein conduits are superior to prosthetic grafts, especially in infrapopliteal bypasses for CLI. We also studied the new method of scaffolding vein grafts of suboptimal quality and found that it may enable the use of vein grafts of compromised quality otherwise unsuitable for bypass grafting. A remarkable finding was that patients with the combination of high operative risk due to severe comorbidities and a risk graft have extremely poor survival, suggesting that only relatively fit patients should undergo complex bypasses with risk grafts. The results of this study can be used in clinical practice as an aid in preoperative patient selection and decision-making. In the future, the need for vascular surgery will increase significantly as the elderly and diabetic populations grow, which emphasises the importance of focusing on those patients who will benefit from infrainguinal bypass. Therefore, the individual risk of the patient, ambulatory status, outcome expectations, the risk of the bypass procedure, as well as technical factors such as the suitability of the outflow anatomy and the available vein material, should all be assessed and taken into consideration when deciding on the best revascularization strategy.
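The eGFR criterion cited above (below 30 ml/min/1.73 m²) can be illustrated with a standard estimating equation. The sketch below uses the four-variable MDRD Study equation; the thesis may have used a different estimator, and the patient values are hypothetical.

```python
# Illustration of the eGFR criterion using the four-variable MDRD Study equation
# (the thesis may have used another estimating equation; patient values are hypothetical).
def egfr_mdrd(serum_creatinine_mg_dl, age_years, female, black=False):
    """IDMS-traceable 4-variable MDRD estimate, ml/min/1.73 m^2."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

egfr = egfr_mdrd(serum_creatinine_mg_dl=2.8, age_years=78, female=True)
print(f"eGFR = {egfr:.0f} ml/min/1.73 m2 ->",
      "high-risk (below 30)" if egfr < 30 else "above the threshold of 30")
```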

Relevance:

30.00%

Publisher:

Abstract:

The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements. The reliability of the results depends on reliable methods in every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies in order to increase our understanding of the factors that affect the measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge of spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of the land use, the variation of microbiological characteristics appeared to lack a predictable spatial structure at 0.5-10 m. Temporal and soil depth-related patterns were studied in relation to plant growth in cropped soil. The results showed that most enzyme activities and microbial biomass show a clear decreasing trend in the top 40 cm of the soil profile and a temporal pattern during the growing season. A new procedure for sampling soil microbiological characteristics, based on stratified sampling and pre-characterisation of samples, was developed. A practical example demonstrated the potential of the new procedure to reduce the analysis effort involved in laborious microbiological measurements without loss of precision. The investigation of the storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well, whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of the DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or the DNA-based quantification approach. The main advantages of the new method are its simple protocol and the possibility of analysing a large number of samples and replicates simultaneously.
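Behind any MPN protocol, the most-probable-number estimate is the cell density λ that solves the maximum-likelihood equation Σ x_i·v_i/(1 − exp(−λ·v_i)) = Σ n_i·v_i over the dilution series, where n_i tubes receive inoculum volume v_i and x_i of them turn positive. The sketch below solves this equation by bisection for a hypothetical tenfold dilution series; classical MPN tables encode the same calculation.

```python
# Minimal maximum-likelihood MPN estimate for a dilution series, solved by bisection
# (hypothetical tube counts and volumes; classical MPN tables encode the same solution).
import math

def mpn_estimate(volumes_g, positives, tubes, lo=1e-9, hi=1e9, tol=1e-9):
    """Solve sum x_i*v_i/(1-exp(-lam*v_i)) = sum n_i*v_i for lam (cells per gram)."""
    total = sum(n * v for n, v in zip(tubes, volumes_g))

    def score(lam):
        return sum(x * v / (1.0 - math.exp(-lam * v))
                   for x, v in zip(positives, volumes_g) if x > 0) - total

    # score(lam) decreases monotonically and changes sign, so bisection applies
    for _ in range(200):
        mid = math.sqrt(lo * hi)                 # bisect on a log scale
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
        if hi / lo < 1.0 + tol:
            break
    return math.sqrt(lo * hi)

# Three tenfold dilutions, 5 tubes each, inoculated with 0.1, 0.01 and 0.001 g of soil
print(f"MPN ~ {mpn_estimate([0.1, 0.01, 0.001], positives=[5, 3, 1], tubes=[5, 5, 5]):.0f} cells/g")
```

For the 5-3-1 pattern shown this gives roughly 110 cells per gram, in line with the familiar tabulated value for that combination.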

Relevance:

30.00%

Publisher:

Abstract:

Competition is an immensely important area of study in economic theory, business and strategy. It is known to be vital in meeting consumers' growing expectations, stimulating growth in the size of the market, pushing innovation, reducing cost and consequently generating better value for end users, among other things. That said, it is important to recognize that supply chains, as we know them, have changed the way companies deal with each other, in both confrontational and conciliatory terms. With the rise of global markets and outsourcing destinations, increased technological development in transportation, communication and telecommunications has meant that geographical barriers of distance with regard to competition are a thing of the past in an increasingly flat world. Even though the dominant articulation of competition within the management and business literature rests mostly within economic competition theory, this thesis draws attention to the implicit shift in the recognition of other forms of competition in today's business environment, especially those involving supply chain structures. There is thus popular agreement within a broad business arena that competition between companies is set to take place along their supply chains. Hence, management's attention has been focused on how supply chains could become more aggressive by making each firm in the chain more efficient. However, there is much disagreement on the mechanism through which such competition, pitching supply chain against supply chain, will take place. The purpose of this thesis, therefore, is to develop and conceptualize the notion of supply chain vs. supply chain competition within the discipline of supply chain management. The thesis proposes that competition between supply chains may be carried forward via the use of competition theories that emphasize interaction and dimensionality, with chains encountering friction from a number of sources in their search for critical resources and services. The thesis demonstrates how supply chain vs. supply chain competition may be carried out theoretically, using generated data for illustration, and practically, using logistics centers as a way to provide a link between theory and the corresponding practice of this evolving competition mode. The thesis concludes that supply chain vs. supply chain competition, no matter the conceptualization taken, is complex, novel and can very easily be distorted and abused. It therefore calls for the joint development of regulatory measures by practitioners and policymakers alike, to guide this developing mode of competition.