876 results for Writing in the style of an author
Abstract:
The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results cautioned against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across the six particular WQ variables used, and furthermore depends on the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results underscore the need to better understand both the WQ measure and the statistical form of it that homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to the actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more time period for a revised hurricane forecast.
A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated with existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes in regard to the timing of the evacuation decision was achieved.
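The two-period optimal stopping logic described above can be sketched with backward induction. This is a minimal illustration, not the essay's calibrated model: all costs, damages, and forecast probabilities below are hypothetical placeholders.

```python
# Minimal two-period optimal-stopping sketch of the evacuation decision.
# All numeric values are hypothetical illustrations, not calibrated data.

def period2_cost(p_strike, evac_cost, damage):
    # Final period before landfall: choose the cheaper of evacuating
    # (certain cost) or staying (expected storm damage).
    return min(evac_cost, p_strike * damage)

def period1_decision(forecasts, probs, evac_cost, damage):
    """Evacuate now, or wait one period for a revised forecast?

    forecasts: possible revised strike probabilities next period
    probs:     likelihood of each revised forecast arriving
    """
    cost_evacuate_now = evac_cost
    # Expected cost of waiting = expectation over revised forecasts of
    # the optimal period-2 choice under each forecast.
    cost_wait = sum(q * period2_cost(p2, evac_cost, damage)
                    for p2, q in zip(forecasts, probs))
    if cost_evacuate_now <= cost_wait:
        return "evacuate", cost_evacuate_now
    return "wait", cost_wait

decision, cost = period1_decision(
    forecasts=[0.05, 0.8],   # forecast may resolve toward miss or hit
    probs=[0.6, 0.4],
    evac_cost=1_000,
    damage=10_000)
print(decision, cost)
```

Here the option value of waiting for a sharper forecast makes delaying optimal, which is the mechanism the dynamic model exploits at every period before landfall.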
Abstract:
Acknowledgment: The first two authors wish to express their sincerest thanks to the Iran National Science Foundation (INSF) for supporting this work under Contract Number 92021291.
Abstract:
This exhibition was a research presentation of works made at the Center for Land Use Interpretation (CLUI) base in Wendover, Utah, USA between 2008 and 2010. The project was commissioned by the Center for Land Use Interpretation in the USA and funded by The Henry Moore Foundation in the UK. Documentation of research conducted in the field was made available as video and installation. An experimental discourse on the preservation of land art was presented alongside GPS drawings, with research information displayed as maps and documents. In examining physical sites in Utah, USA, the project connected to contemporary discourse centred on archives in relation to land art and land use. Using experimental processes conceived in relation to key concepts such as event structures and entropy, conceptual frameworks developed by Robert Smithson (USA) and John Latham (UK), the 'death drive' of the archive was examined in the context of a cultural impulse to preserve iconic works. The work took items from Latham's archive and placed them at the canonical 'Spiral Jetty', Smithson's land art work at Rozel Point north of Salt Lake City. This became a focus for the project, which also highlighted the role of the Getty Foundation in documenting major public artworks and of CLUI in creating an American Land Museum. Work was created in the field at extremely remote locations using GPS technologies, and visual tools were developed to articulate the concepts of the artists discussed and to engage the exhibition audience in ideas of transformation and entropy in art. Audiences were encouraged to sign a petition to be used in the future preservation of Spiral Jetty, which currently faces development challenges.
Abstract:
A variety of conservation policies now frame the management of fishing activity, as does the spatial planning of different sectoral activities. These framework policies are additional to classical fishery management. There is a risk that the policies applied to the marine system are not coherent from a fisheries point of view. The spatial management of fishing activity at regional scale has the potential to meet multiple management objectives on a habitat basis. Here we consider how to integrate multiple objectives of different policies into integrated ocean management scenarios. In the EU, the European Directives and the CFP are now implementing the ecosystem approach to the management of human activity at sea. In this context, we further identify three research needs:
• Develop Management Strategy Evaluation (MSE) for multiple-objective and multiple-sector spatial management schemes
• Improve knowledge on and evaluation of functional habitats
• Develop spatially-explicit end-to-end models with appropriate complexity for spatial MSE
The contribution is based on the results of a workshop of the EraNet COFASP.
Abstract:
Master's dissertation, Quality in Analyses, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014
Abstract:
The dissertation starts by describing the phenomena related to the growing importance recently acquired by satellite applications. The spread of such technology comes with implications, such as an increase in maintenance cost, from which derives the interest in developing advanced techniques that favor greater autonomy of spacecraft in health monitoring. Machine learning techniques are widely employed to lay a foundation for effective fault-detection systems that examine telemetry data. Telemetry consists of a considerable amount of information; therefore, the adopted algorithms must be able to handle multivariate data while facing the limitations imposed by on-board hardware. In the framework of outlier detection, the dissertation addresses unsupervised machine learning methods, where no prior knowledge of the data behavior is assumed. Specifically, two models are considered, namely Local Outlier Factor and One-Class Support Vector Machines. Their performances are compared in terms of both prediction accuracy and computational cost. Both models are trained and tested on the same sets of time series data in a variety of settings, aimed at gaining insight into the effect of increasing dimensionality. The results obtained allow us to conclude that both models, combined with a proper tuning of their characteristic parameters, successfully fulfill the role of outlier detectors in multivariate time series data. Nevertheless, in this specific context, Local Outlier Factor outperforms One-Class SVM, in that it proves to be more stable over a wider range of input parameter values. This property is especially valuable in unsupervised learning, since it suggests that the model adapts well to unforeseen patterns.
Abstract:
Several authors (Nadon, 2007; Tauveron, 2005; Routman, 2010) have put forward didactic proposals for teaching writing optimally through children's literature, notably by having students draw inspiration from an author's style. Since children's literature is still seldom used to prompt writing situations at the primary level (Montésinos-Gelet and Morin, 2007), this research presents an innovative device, writing in the manner of an author, which consists of placing the student in a situation of appropriation-observation of a literary work in order to draw out its characteristics and imitate them (Geist, 2005; Tauveron, 2002). According to Olness (2007), exposure to quality children's literature is essential to allow students to learn a variety of styles and literary elements. The aim of this research is to describe ten "writing in the manner of an author" sequences designed by the teacher-researcher and to identify their impact on students' written production skills, reading comprehension, and motivation to write. The research was carried out over a period of 5 months with 18 students in a 2nd-year primary class. It emerges from this research that the students greatly developed their ability to analyze and imitate the characteristics of a source text, and that they transferred these learnings beyond the context of our research. Through frequent practice and modeling, they assimilated the six traits of writing and showed a growing interest in children's literature.
Abstract:
Decoding emotional prosody is crucial for successful social interactions, and continuous monitoring of emotional intent via prosody requires working memory. It has been proposed by Ross and others that emotional prosody cognitions in the right hemisphere are organized in an analogous fashion to propositional language functions in the left hemisphere. This study aimed to test the applicability of this model in the context of prefrontal cortex working memory functions. BOLD response data were therefore collected during performance of two emotional working memory tasks by participants undergoing fMRI. In the prosody task, participants identified the emotion conveyed in pre-recorded sentences, and working memory load was manipulated in the style of an N-back task. In the matched lexico-semantic task, participants identified the emotion conveyed by sentence content. Block-design neuroimaging data were analyzed parametrically with SPM5. At first, working memory for emotional prosody appeared to be right-lateralized in the PFC; however, further analyses revealed that it shared much bilateral prefrontal functional neuroanatomy with working memory for lexico-semantic emotion. Supplementary separate analyses of males and females suggested that these language functions were less bilateral in females, but their inclusion did not alter the direction of laterality. It is concluded that Ross et al.'s model is not applicable to prefrontal cortex working memory functions, that the evidence that working memory cannot be subdivided in the prefrontal cortex according to material type is strengthened, and that incidental working memory demands may explain the frontal lobe involvement in emotional prosody comprehension revealed by neuroimaging studies.
Abstract:
The purpose of this study was to examine, in the context of an economic model of health production, the relationship between inputs (health-influencing activities) and fitness. Primary data were collected from 204 employees of a large insurance company at the time of their enrollment in an industrially-based health promotion program. The inputs of production included medical care use, exercise, smoking, drinking, eating, coronary disease history, and obesity. The variables of age, gender and education, known to affect the production process, were also examined. Two estimates of fitness were used: self-report and a physiologic estimate based on exercise treadmill performance. Ordinary least squares and two-stage least squares regression analyses were used to estimate the fitness production functions. In the production of self-reported fitness status, the coefficients for the exercise, smoking, eating, and drinking production inputs, and the control variable of gender, were statistically significant and possessed theoretically correct signs. In the production of physiologic fitness, exercise, smoking and gender were statistically significant. Exercise and gender were theoretically consistent while smoking was not. Results are compared with previous analyses of health production.
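An ordinary least squares production-function estimate of the kind described above can be sketched in a few lines. The data here are simulated and the input names are placeholders, not the study's actual variables or coefficients:

```python
# Bare-bones OLS sketch of a fitness production function on simulated data.
# Coefficients and inputs are illustrative placeholders, not study results.
import numpy as np

rng = np.random.default_rng(42)
n = 204                                  # sample size matching the study
exercise = rng.normal(3.0, 1.0, n)       # e.g. hours of exercise per week
smoking = rng.binomial(1, 0.3, n)        # smoker indicator
noise = rng.normal(0.0, 0.5, n)

# "True" production function used to simulate the fitness outcome:
# positive return to exercise, negative return to smoking.
fitness = 2.0 + 0.8 * exercise - 1.2 * smoking + noise

# OLS: regress fitness on a constant, exercise, and smoking.
X = np.column_stack([np.ones(n), exercise, smoking])
beta, *_ = np.linalg.lstsq(X, fitness, rcond=None)
print("estimated coefficients:", beta.round(2))
```

With exogenous inputs OLS recovers the signs the theory predicts; the study's additional two-stage least squares step addresses the case where an input (such as medical care use) is itself determined jointly with fitness.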
Abstract:
We describe the development of a capture enzyme-linked immunosorbent assay for the detection of the dengue virus nonstructural protein NS1. The assay employs rabbit polyclonal and monoclonal antibodies as the capture and detection antibodies, respectively. Immunoaffinity-purified NS1 derived from dengue 2 virus-infected cells was used as a standard to establish a detection sensitivity of approximately 4 ng/ml for an assay employing monoclonal antibodies recognizing a dengue 2 serotype-specific epitope. A number of serotype cross-reactive monoclonal antibodies were also shown to be suitable probes for the detection of NS1 expressed by the remaining three dengue virus serotypes. Examination of clinical samples demonstrated that the assay was able to detect NS1 with minimal interference from serum components at the test dilutions routinely used, suggesting that it could form the basis of a useful additional diagnostic test for dengue virus infection. Furthermore, quantitation of NS1 levels in patient sera may prove to be a valuable surrogate marker for viremia. Surprisingly high levels of NS1, as much as 15 µg/ml, were found in acute-phase sera taken from some of the patients experiencing serologically confirmed dengue 2 virus secondary infections, but NS1 was not detected in the convalescent sera of these patients. In contrast, NS1 could not be detected in either acute-phase or convalescent serum samples taken from patients with serologically confirmed primary infection. The presence of high levels of secreted NS1 in the sera of patients experiencing secondary dengue virus infections, and in the context of an anamnestic antibody response, suggests that NS1 may contribute significantly to the formation of the circulating immune complexes that are suspected to play an important role in the pathogenesis of severe dengue disease.
Abstract:
This article is concerned primarily with an examination and comparison of select aspects of the model international consumer protection laws proposed by the United Nations (UN), the European Union (EU), and the Organisation for Economic Co-operation and Development (OECD), using the Trade Practices Act 1974 (Australia) as a basis for examination and comparison. As a secondary consideration, it also broadly examines the content of, and differences between, the model laws. The motive for this article is that any future enforceable international consumer protection regime (possibly in the form of an international treaty or convention) would need to take into account the UN, EU and OECD guidelines. A cross-comparison of those model laws, and a comparison of them with the consumer protection provisions of a well established national consumer protection law, should provide a useful starting point for the development of such a regime. The 'select aspects' of the model laws in question are the various provisions of those laws which could relate to situations involving the wrong delivery or non-delivery of goods.
Electromagnetic tracker feasibility in the design of a dental superstructure for edentulous patients
Abstract:
The success of the osseointegration concept and the Brånemark protocol is highly associated with the accuracy achieved in the production of an implant-supported prosthesis. One of the most critical steps for the long-term success of these prostheses is the accuracy obtained during the impression procedure, which is affected by factors such as the impression material, implant position, angulation and depth. This paper investigates the feasibility of 3D electromagnetic motion tracking systems as an acquisition method for modeling full-arch implant-supported prostheses. To this end, we propose an implant acquisition method at the patient's mouth and a calibration procedure, based on a 3D electromagnetic tracker, that obtains combined measurements of implant position and angulation, eliminating the use of any impression material. Three calibration algorithms (namely linear interpolation, higher-order polynomial and Hardy multiquadric) were tested to compensate for the electromagnetic tracker distortions introduced by the presence of nearby metals. Moreover, implants from different suppliers were tested to study their impact on tracking accuracy. The calibration methodology and the algorithms employed proved to be a suitable strategy for the evaluation of novel dental impression techniques. However, in the particular case of the evaluated electromagnetic tracking system, the order of magnitude of the obtained errors invalidates its use for the full-arch modeling of implant-supported prostheses.
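The Hardy multiquadric calibration mentioned above amounts to fitting a radial-basis interpolant that maps distorted tracker readings back to known reference positions. The sketch below illustrates the idea on a synthetic grid with a made-up smooth distortion; the shape parameter `c` and all coordinates are illustrative, not the paper's setup:

```python
# Sketch of distortion calibration via Hardy multiquadric radial basis
# functions: learn a map from distorted readings to true grid positions.
import numpy as np

def multiquadric(r, c=1.0):
    # Hardy multiquadric basis: phi(r) = sqrt(r^2 + c^2).
    return np.sqrt(r**2 + c**2)

def fit_rbf(readings, references, c=1.0):
    # Solve Phi @ W = references, with Phi[i, j] = phi(||x_i - x_j||).
    d = np.linalg.norm(readings[:, None, :] - readings[None, :, :], axis=-1)
    return np.linalg.solve(multiquadric(d, c), references)

def apply_rbf(weights, readings, query, c=1.0):
    d = np.linalg.norm(query[:, None, :] - readings[None, :, :], axis=-1)
    return multiquadric(d, c) @ weights

# Known calibration-grid positions, plus a synthetic smooth distortion
# standing in for the field warping caused by nearby metal.
grid = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], float)
distorted = grid + 0.05 * np.sin(grid)
w = fit_rbf(distorted, grid)

# Calibrate a new distorted reading back toward its true position.
true_pt = np.array([[2.5, 2.5, 0.0]])
reading = true_pt + 0.05 * np.sin(true_pt)
calibrated = apply_rbf(w, distorted, reading)
print(calibrated)
```

The interpolant is exact at the calibration points by construction; its value between points is what distinguishes the multiquadric from the simpler linear and polynomial alternatives the paper compares.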
Abstract:
This essay analyses how the different types of memory may influence the process of identity formation. It shall be argued that not only memories formed from the subject's own experiences play a key role in this process; intermediated, received narratives from the past, memories transmitted either symbolically or by elder members of the group, or what has meanwhile been termed "postmemory", also play an important part in the development of an individual's identity map. This theoretical framework will be illustrated with the novelistic work of the Austrian, Israeli-born historian, writer and political activist Doron Rabinovici (*1961). As a representative of the so-called "second generation" of Holocaust writers, a generation of individuals who did not experience the Nazi genocidal violence but who had to form their identities under the shadow of such a brutal past, Rabinovici addresses essential topics such as the intergenerational transmission of memory and guilt within survivor families, identity formation of second-generation individuals (Jews and non-Jews), and the question of simultaneously belonging to different social, historical and linguistic contexts.
Abstract:
Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
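The per-sample efficiency estimation described above can be illustrated with a simulated amplification curve: subtract the baseline, fit a straight line to log fluorescence over a window in the log-linear phase, and convert the slope back to an efficiency. The curve shape, baseline, and window thresholds below are illustrative, not the article's algorithm or data:

```python
# Illustration: per-sample PCR efficiency from the slope of a regression
# line fitted to log-fluorescence in the log-linear phase (simulated data).
import numpy as np

cycles = np.arange(1, 41)
efficiency_true = 1.9          # amplification factor per cycle (max 2.0)
n0 = 1e-6                      # starting amount, arbitrary units
baseline = 0.05                # constant background fluorescence
plateau = 10.0                 # saturation level of the reaction

# Exponential growth that bends into a plateau, plus baseline offset.
signal = n0 * efficiency_true ** cycles
fluorescence = baseline + plateau * signal / (plateau + signal)

# Subtract the (here, known) baseline; a real algorithm must estimate it,
# e.g. by reconstructing the log-linear phase downward from the plateau.
corrected = fluorescence - baseline

# Fit a line to log10(fluorescence) over a window below the plateau bend.
window = (corrected > 1e-3) & (corrected < 1.0)
slope, intercept = np.polyfit(cycles[window], np.log10(corrected[window]), 1)
efficiency_est = 10 ** slope
print("estimated efficiency:", round(efficiency_est, 2))
```

Because the estimated starting concentration scales as efficiency raised to the quantification cycle, even a small baseline error shifts the fitted slope and is amplified exponentially in the final result, which is the article's central point.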