Abstract:
Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors’ judgements of performance and risk. Real estate, like other assets, is selected on the basis of criteria such as its marginal contribution to a mean-variance efficient multi-asset portfolio, subject to the investor’s objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the optimum exposure levels dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify “soft” parameters in decision making that influence the optimal allocation for that asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and to examining investors’ perceptions through a historical analysis of market expectations, a comparison with historical data, and an analysis of actual performance.
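The normative allocation models referred to here are typically mean-variance optimizers in the Markowitz sense. As a minimal, hedged sketch (not the paper's own model), the following snippet computes a long-only mean-variance allocation across three asset classes, using purely hypothetical expected returns, covariances, and a hypothetical risk-aversion coefficient:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical annual expected returns and covariance matrix for
# three asset classes: equities, bonds, property (illustrative only).
mu = np.array([0.08, 0.04, 0.06])
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.020]])
risk_aversion = 3.0  # assumed investor risk-aversion coefficient

def negative_utility(w):
    # Mean-variance utility: expected return minus a variance penalty.
    return -(w @ mu - 0.5 * risk_aversion * w @ cov @ w)

n = len(mu)
result = minimize(
    negative_utility,
    x0=np.full(n, 1.0 / n),                        # start from equal weights
    bounds=[(0.0, 1.0)] * n,                       # long-only constraint
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print(dict(zip(["equities", "bonds", "property"], result.x.round(3))))
```

In the terms of the abstract, the “soft” parameters it discusses would enter as additional constraints or adjustments layered on top of the weights such an optimizer proposes.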
Abstract:
How does a society less than two decades after a liberation war which involved large sections of the population come to terms with the memories of violence and war — a war in which there was no clear distinction between insurgent and counter‐insurgent, liberator and oppressor and in which the majority of the casualties can be found among the rural civilian population? This was a predicament not exclusive to Zimbabwe, but one which also applies to Mozambique, South Africa and, more recently, to Rwanda. Since its independence Zimbabwe has been a prime example of successful reconciliation. Ranger has argued that spiritual healing has contributed importantly to coming to terms with the trauma of war through turning violence into history. Here it will be argued that an analysis of the intersections between memories of violence, healing, and history reveals a twofold process. Social healing is made possible by a shift from conviction and compensation to revealing without convicting. At the same time healing provides an arena for communities in which competing and contesting memories of violence are renegotiated. Through these processes sense is being made of the past; history is being made.
Abstract:
Graduate Program in Agronomy (Energy in Agriculture) - FCA
Abstract:
Graduate Program in Agronomy (Energy in Agriculture) - FCA
Abstract:
The general objective of this research is to explore theories and methodologies from the sustainability indicators, environmental management and decision-making disciplines with the operational purpose of producing scientific, robust and relevant information for supporting system understanding and decision making in real case studies. Several tools have been applied in order to increase the understanding of socio-ecological systems as well as to provide relevant information on the choice between alternatives. These tools have always been applied with the complexity of the issues, and the uncertainty tied to partial knowledge of the systems under study, in mind. Two case studies with specific application to performance measurement (environmental performance in the case of the K8 approach and sustainable development performance in the case of the EU Sustainable Development Strategy) and a case study about the selection of sustainable development indicators amongst municipalities in Scotland are discussed in the first part of the work. In the second part of the work, the common denominator among the subjects is the application of spatial indices and indicators to address operational problems in land use management within the territory of the Ravenna province (Italy). The main conclusion of the thesis is that a ‘perfect’ methodological approach which always produces the best results in assessing sustainability performance does not exist. Rather, there is a pool of correct approaches answering different evaluation questions, to be used when the methodology fits the purpose of the analysis. For this reason, methodological limits and conceptual assumptions, as well as the consistency and transparency of the assessment, become the key factors for assessing the quality of the analysis.
VERIFICATION OF DNA PREDICTED PROTEIN SEQUENCES BY ENZYME HYDROLYSIS AND MASS SPECTROMETRIC ANALYSIS
Abstract:
The focus of this thesis lies in the development of a sensitive method for the analysis of protein primary structure which can be easily used to confirm the DNA sequence of a protein's gene and determine the modifications which are made after translation. This technique involves the use of dipeptidyl aminopeptidase (DAP) and dipeptidyl carboxypeptidase (DCP) to hydrolyze the protein and the mass spectrometric analysis of the dipeptide products.

Dipeptidyl carboxypeptidase was purified from human lung tissue and characterized with respect to its proteolytic activity. The results showed that the enzyme has a relatively unrestricted specificity, making it useful for the analysis of the C-terminus of proteins. Most of the dipeptide products were identified using gas chromatography/mass spectrometry (GC/MS). In order to analyze the peptides not hydrolyzed by DCP and DAP, as well as the dipeptides not identified by GC/MS, a FAB ion source was installed on a quadrupole mass spectrometer and its performance evaluated with a variety of compounds.

Using these techniques, the sequences of the N-terminal and C-terminal regions and seven fragments of bacteriophage P22 tail protein have been verified. All of the dipeptides identified in these analyses were in the same DNA reading frame, thus ruling out the possibility of a single base being inserted or deleted from the DNA sequence. The verification of small sequences throughout the protein sequence also indicates that no large portions of the protein have been removed after translation.
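As a rough, hypothetical illustration of the matching step (not code from the thesis), the snippet below converts successive non-overlapping N-terminal dipeptides of an assumed sequence into the monoisotopic masses one would search for in the spectra; both the sequence and the strictly stepwise cleavage are illustrative assumptions:

```python
# Monoisotopic residue masses (Da) for the 20 standard amino acids.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # added once per free dipeptide

def dipeptide_masses(sequence):
    """Masses of successive non-overlapping dipeptides from the N-terminus,
    mimicking stepwise dipeptidyl-aminopeptidase cleavage."""
    masses = []
    for i in range(0, len(sequence) - 1, 2):
        pair = sequence[i:i + 2]
        masses.append((pair, RESIDUE_MASS[pair[0]] + RESIDUE_MASS[pair[1]] + WATER))
    return masses

# Hypothetical N-terminal stretch, for illustration only.
for pair, mass in dipeptide_masses("MAKVLDGTEW"):
    print(f"{pair}: {mass:.3f} Da")
```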
Abstract:
Schizophrenia (SZ) is a complex disorder with high heritability and variable phenotypes, for which there has been limited success in finding causal genes associated with disease development. Pathway-based analysis is an effective approach for investigating the molecular mechanism of susceptibility genes associated with complex diseases. The etiology of complex diseases could be a network of genetic factors, and interactions may occur among the genes. In this work we argue that some genes might be of small effect and by themselves neither sufficient nor necessary to cause the disease; however, their effects may induce slight changes to gene expression or affect protein function. Analyzing the gene-gene interaction mechanism within the disease pathway would therefore play a crucial role in dissecting the genetic architecture of complex diseases, making pathway-based analysis a complementary approach to GWAS.

In this study, we implemented three novel linkage disequilibrium based statistics, the linear combination, the quadratic, and the decorrelation test statistics, to investigate the interaction between linked and unlinked genes in two independent case-control GWAS datasets for SZ including participants of European (EA) and African (AA) ancestries. The EA population included 1,173 cases and 1,378 controls with 729,454 genotyped SNPs, while the AA population included 219 cases and 288 controls with 845,814 genotyped SNPs. Using the gene-gene interaction method, we identified 17,186 significantly interacting gene-sets in the EA dataset and 12,691 gene-sets in the AA dataset. We also identified 18,846 genes in the EA dataset and 19,431 genes in the AA dataset that were in the disease pathways. However, few genes were reported to have significant association with SZ.

Our research determined the pathway characteristics for schizophrenia through the gene-gene interaction and gene-pathway based approaches. Our findings suggest that these methods offer insightful inferences for studying the molecular mechanisms of common complex diseases.
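The abstract does not spell out the three LD-based statistics, so as a generic, simplified stand-in (not the authors' linear combination, quadratic, or decorrelation tests), the sketch below contrasts the inter-locus genotype correlation between cases and controls with a Fisher z-test; the 0/1/2 genotypes are simulated and the EA sample sizes are used purely for scale:

```python
import numpy as np
from scipy import stats

def ld_contrast_test(snp1_cases, snp2_cases, snp1_ctrls, snp2_ctrls):
    """Compare inter-locus genotype correlation (a proxy for LD) between
    cases and controls via a Fisher z-transform; a large difference is
    taken as evidence of gene-gene interaction. Simplified illustration."""
    r_case = np.corrcoef(snp1_cases, snp2_cases)[0, 1]
    r_ctrl = np.corrcoef(snp1_ctrls, snp2_ctrls)[0, 1]
    n_case, n_ctrl = len(snp1_cases), len(snp1_ctrls)
    z = (np.arctanh(r_case) - np.arctanh(r_ctrl)) / np.sqrt(
        1.0 / (n_case - 3) + 1.0 / (n_ctrl - 3))
    return z, 2 * stats.norm.sf(abs(z))   # z statistic and two-sided p-value

# Simulated 0/1/2 genotype vectors, for illustration only.
rng = np.random.default_rng(0)
g1_ca, g2_ca = rng.integers(0, 3, 1173), rng.integers(0, 3, 1173)
g1_co, g2_co = rng.integers(0, 3, 1378), rng.integers(0, 3, 1378)
print(ld_contrast_test(g1_ca, g2_ca, g1_co, g2_co))
```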
Abstract:
This research focused on the topic of end-of-life planning and decision-making for adults affected by mental retardation. Adults with mental retardation have unique challenges in this regard, including difficulty communicating their wishes without assistance and diminished decision-making skills. The primary research objective was to identify factors that can affect opportunities for adults with mental retardation in community-based services settings (and their advocates) to be involved in planning and deciding about their own end-of-life experience.

A descriptive qualitative inquiry was designed to explore issues related to death and dying, and the notion of end-of-life planning, from the perspective of adults with mental retardation who receive publicly-funded community services ("clients") and family members of individuals who receive such services. Study participants were recruited from a single mental retardation service provider in a large urban setting (the "Agency"). Sixteen clients and 14 families of Agency clients took part. Client data collection was accomplished through face-to-face interviews, focus group meetings, and record reviews; family members were involved in a face-to-face interview only.

An initial coding scheme was developed based upon literature and policy reviews, and themes related to the research questions. Analysis involved extracting data from transcripts and records and placing it into appropriate thematic categories, building support for each theme with the accumulated data. Coding themes were modified to accommodate new data when it challenged existing themes.

Findings suggest that adults with mental retardation do have the requisite knowledge, interest, and ability to participate in decisions about their end-of-life experience and handling of affairs. Siblings are overwhelmingly the chosen future surrogates and they (or their children) will likely be the end-of-life advocates for their brothers and sisters affected by mental retardation. Findings further point to a need for increased awareness, accurate information, and improved communication about end-of-life issues, both in general and particular to adults affected by mental retardation. Also suggested by the findings is a need to focus on creating accommodations and adaptations that can best uncover a person's authentic views on life and death and related end-of-life preferences. Practical implications and suggestions for further research are also discussed.
Abstract:
The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. This technique was demonstrated using experimental data obtained on the osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used in this study to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on a set of experimental data in order to obtain particular multi-objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were used simultaneously to choose the best alternative among the set of non-dominated solutions. The multi-criteria optimization and decision-making technique proposed in this study can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent, and adequate final compromise solution or food process. This technique can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
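As one small, hedged illustration of the decision-making step (not the study's own code), the AHP stage typically derives criterion weights as the principal eigenvector of a Saaty-scale pairwise comparison matrix; the matrix below compares three of the responses named above and is entirely hypothetical:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP criteria weights as the normalized principal eigenvector
    of a reciprocal pairwise-comparison matrix."""
    values, vectors = np.linalg.eig(pairwise)
    principal = np.real(vectors[:, np.argmax(np.real(values))])
    return principal / principal.sum()

# Hypothetical Saaty-scale comparisons for three criteria:
# water loss vs solute gain vs rehydration ratio (illustrative only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A).round(3))   # weights summing to 1
```

The resulting weights would then be used to rank the Pareto-optimal alternatives produced by the optimization stage.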
Abstract:
In the past three decades, the dynamics of global economic restructuring have radically redefined the role of cities. The transition from Keynesianism to neoliberalism has caused a shift in local governments’ urban policies, which have progressively abandoned the tasks of regulation and redistribution to focus on promoting economic growth and competitiveness. In this context, many critics have pointed out that urban regeneration has become a vehicle for extracting value from the city and is causing the expulsion of the most vulnerable citizens. However, the regeneration of consolidated areas is also an opportunity to improve the living conditions of the resident population, and is a necessary policy to control the expansion of the city and reduce the need for transportation, thus promoting more sustainable cities. Assuming that the governance of urban regeneration processes is key to the final outcome of the plans and determines the resulting city model, the goal of this research is to verify whether urban regeneration is necessarily a value extraction mechanism or whether it can improve the quality of life in cities through citizens’ participation. It proposes a framework for the analysis of decision-making in urban regeneration processes and its impact on the results of the plans, taking as a case study the city of Boston, which since the 1990s has been trying to become a "city of neighborhoods", encouraging citizen participation while seeking to position itself in the global economic scene. The analysis focuses on two redevelopment plans initiated in the late 1990s. The Jackson Square case allows us to understand the role of civil society and the third sector in the regeneration of disadvantaged neighborhoods, in a clear example of bottom-up planning. By contrast, the conversion of the South Boston waterfront to build the Innovation District illustrates the large redevelopment efforts undertaken for economic stimulus, traditionally linked to downtowns and led by government and economic elites (the local “growth machine”) through more technocratic, top-down planning processes. The research is based on a qualitative analysis of the decision-making processes and the relationships between those involved, as well as an evaluation of the implementation of those decisions and their influence on the resulting urban model.

The analysis suggests that the governance of urban regeneration processes decisively influences the outcome of interventions; however, community engagement in the decision-making process is not enough for the result of urban regeneration to counteract the effects of neoliberalization, especially if it is limited to the planning phase and does not extend to the implementation of the projects, and if it is not supported by a broader political mobilization that ensures redistributive public action. Moreover, urban regeneration processes redefine the urban model, since the choice of intervention areas has important consequences for the territorial balance of the city. The results of this study have implications for the discipline of urban planning. On the one hand, they confirm the validity of the "negotiated planning" paradigm, albeit under a discourse of public leadership and without a direct appeal to the leading role of the private sector. On the other hand, collaborative planning in a context of "responsibilization" of community-based organizations can deactivate the political power of citizen participation and serve as a "buffer" for the local government. Furthermore, the replacement of comprehensive planning, as a tool for defining the city's future, by an opportunistic planning based on intervention in strategic areas that are supposed to induce change in the rest of the city neither allows a coherent and consensual urban model that is collectively desired, nor allows planning to be used as a redistribution mechanism.
Abstract:
An individual faced with intergroup conflict chooses from a vast array of possible actions, ranging from grumbling among ingroup friends, to voting and demonstrating, to rioting and revolution. The present paper conceptualises these intergroup choices as rationally shaped by perceptions of the benefits and costs associated with the action (expectancy-value processes). However, in presenting a model of agentic normative influence, it is argued that in intergroup contexts group-level costs and benefits play a critical role in individuals' decision-making. In the context of English-French conflict in Quebec, Canada, four studies provide evidence that group-level costs and benefits influence individuals' decision-making in intergroup conflict; that the individual level of analysis need not mediate the group level of analysis; that group-level costs and benefits mediate the relationship between social identity and intentions to engage in collective action; and that perceptions of outgroup and ingroup norms for intergroup behaviours are relatively invariant and predictably related to perceptions of the group- and individual-level benefits and costs associated with individualistic versus collective actions. By modelling the relationship between group norms and group-level costs and benefits, social psychologists may begin to address the processes that underlie identity-behaviour relationships in collective action and intergroup conflict.
Abstract:
This research project has developed a novel decision support system using Geographical Information Systems and Multi Criteria Decision Analysis and used it to develop and evaluate energy-from-waste policy options. The system was validated by applying it to the UK administrative areas of Cornwall and Warwickshire. Different strategies were defined by the size and number of the facilities, as well as the technology chosen. Using sensitivity analysis on the results from the decision support system, it was found that key decision criteria included those affected by cost, energy efficiency, transport impacts and air/dioxin emissions. The conclusions of this work are that distributed small-scale energy-from-waste facilities score most highly overall and that scale is more important than technology design in determining overall policy impact. The project's primary contribution to energy-from-waste planning is the development of a Decision Support System that can assist waste disposal authorities in identifying preferred energy-from-waste options tailored specifically to the socio-geographic characteristics of their jurisdictional areas. The project also highlights the potential of energy-from-waste policies that are seldom given enough attention in the UK, namely those of a smaller-scale and distributed nature, often with technology designed specifically for this market.
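The GIS/MCDA combination described here is commonly realized as a weighted-sum aggregation of normalized criterion scores, followed by a weight-perturbation sensitivity check. The sketch below uses hypothetical scores, weights, and strategy labels (none taken from the thesis) to show the idea:

```python
import numpy as np

# Hypothetical normalized criterion scores (0 = worst, 1 = best) for three
# strategies; columns: cost, energy efficiency, transport impact, emissions.
strategies = ["single large plant", "few medium plants", "many small plants"]
scores = np.array([[0.7, 0.8, 0.3, 0.5],
                   [0.6, 0.7, 0.6, 0.6],
                   [0.4, 0.6, 0.9, 0.8]])
weights = np.array([0.35, 0.25, 0.20, 0.20])   # assumed stakeholder weights

overall = scores @ weights                      # simple weighted-sum MCDA
for name, s in sorted(zip(strategies, overall), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")

# Sensitivity check: inflate each weight by 20% and see which strategy wins.
for i in range(len(weights)):
    w = weights.copy()
    w[i] *= 1.2
    w /= w.sum()
    print(f"boost criterion {i}: best = {strategies[int(np.argmax(scores @ w))]}")
```

In the thesis itself the criterion scores would come from the GIS layers (transport distances, site constraints, emissions dispersion) rather than being entered by hand.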
Abstract:
Over the past fifteen years, an interconnected set of regulatory reforms, known as Better Regulation, has been adopted across Europe, marking a significant shift in the way that European Union policies are developed. There has been little exploration of the origins of these reforms, which include mandatory ex ante impact assessment. Drawing on documentary and interview data, this article discusses how and why large corporations, notably British American Tobacco (BAT), worked to influence and promote these reforms. Our analysis highlights (1) how policy entrepreneurs with sufficient resources (such as large corporations) can shape the membership and direction of advocacy coalitions; (2) the extent to which "think tanks" may be prepared to lobby on behalf of commercial clients; and (3) why regulated industries (including tobacco) may favor the use of "evidence tools," such as impact assessments, in policy making. We argue that a key aspect of BAT's ability to shape regulatory reform involved the deliberate construction of a vaguely defined idea that could be strategically adapted to appeal to diverse constituencies. We discuss the theoretical implications of this finding for the Advocacy Coalition Framework, as well as the practical implications of the findings for efforts to promote transparency and public health in the European Union.
Abstract:
In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10²-10³ atoms, others were unable to repeat their work. The consensus now is that the film effects in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large clusters of zinc without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. Clusters resulted from very significant changes in two source parameters: crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle used in the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM) for roughness analysis, and by X-ray diffraction.

Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm² was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between the AFM and RBS data, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy. The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface.

One method increasingly used to study the deposited films, both during the growth process and afterwards, is ellipsometry. Ellipsometry is a surface analytical technique used to determine the optical properties and thickness of thin films. In situ measurements can be made through the windows of a deposition chamber. A method was developed for determining the optical properties of a film that is sensitive only to the growing film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates. This method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
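The tip-sample convolution mentioned above is commonly modeled as a grayscale morphological dilation of the surface by the tip shape. The 1-D sketch below, with a hypothetical parabolic tip and a hypothetical 10 nm zinc island (neither taken from the thesis), shows how a finite tip broadens narrow features:

```python
import numpy as np

def dilate(surface, tip):
    """1-D grayscale dilation: the AFM image height at x is the maximum over
    the tip footprint of surface(x + u) minus the tip depth below its apex,
    which models how a finite tip broadens narrow features."""
    half = len(tip) // 2
    padded = np.pad(surface, half, mode="edge")
    return np.array([
        np.max(padded[i:i + len(tip)] - tip)   # tip given as depth below apex
        for i in range(len(surface))
    ])

# Hypothetical surface: a 10 nm tall, narrow zinc island on flat silicon.
x = np.arange(-50, 51, 1.0)                     # lateral position, nm
surface = np.where(np.abs(x) < 3, 10.0, 0.0)
tip = 0.05 * np.arange(-20, 21, 1.0) ** 2       # parabolic tip, depth from apex

image = dilate(surface, tip)
print(f"width above 5 nm: true {np.sum(surface > 5.0)} nm, "
      f"imaged {np.sum(image > 5.0)} nm")
```

A deconvolution step of the kind described in the abstract would run this reasoning in reverse, eroding the image by the tip shape to recover a tighter estimate of the true island volume.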
Abstract:
The first essay developed a respondent model of Bayesian updating for a double-bounded dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results provided caution against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, and coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and the statistical form homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation modeling by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, in every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more time period for a revised hurricane forecast. A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated with existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes regarding the timing of the evacuation decision was achieved.
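The hypothetical two-period evacuation model lends itself to a compact numerical illustration: evacuate now at a known cost, or wait for a revised forecast and then choose optimally. The sketch below uses made-up costs and forecast-transition probabilities, not the essay's calibrated values:

```python
# Two-period optimal stopping sketch of the household evacuation decision.
# All probabilities and costs are hypothetical, for illustration only.

EVAC_COST = 400.0         # cost of evacuating (assumed equal in both periods)
DAMAGE_IF_HIT = 20000.0   # expected loss if the household stays and is hit

def cost_of_waiting(p_worsens=0.4, p_hit_if_worse=0.6, p_hit_if_better=0.01):
    """Expected cost of waiting one period for a revised forecast and then
    choosing optimally (evacuate only if that is cheaper than staying)."""
    def best_period2_cost(p_hit):
        return min(EVAC_COST, p_hit * DAMAGE_IF_HIT)
    return (p_worsens * best_period2_cost(p_hit_if_worse)
            + (1 - p_worsens) * best_period2_cost(p_hit_if_better))

wait = cost_of_waiting()
decision = "evacuate now" if EVAC_COST < wait else "wait for the revised forecast"
print(f"{decision}: evacuate-now cost {EVAC_COST:.0f}, "
      f"expected cost of waiting {wait:.0f}")
```

With these illustrative numbers waiting is cheaper in expectation, which captures the option value of improved forecast information that drives the multi-period model described in the essay; a richer version would also add a congestion premium to late evacuation.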