879 results for Francis, David: An invitation to ethnomethodology
Abstract:
We report the generation of a retroviral vector that infects human cells specifically through recognition of the low density lipoprotein receptor. The strategy for this targeted infection is to add to the ecotropic envelope protein of Moloney murine leukemia virus, normally tropic for murine cells, a single-chain variable fragment derived from a monoclonal antibody recognizing the human low density lipoprotein receptor. This chimeric envelope protein was used to construct a packaging cell line producing a retroviral vector capable of high-efficiency transfer of the Escherichia coli beta-galactosidase gene to human cells expressing the low density lipoprotein receptor. This approach offers a generalized plan for generating cell- and tissue-specific retroviral vectors, an essential step toward in vivo gene therapy strategies.
Abstract:
Brain injury is the leading cause of disability and death in children in the United States. Student re-entry into the school setting following a traumatic brain injury is crucial to student success. Multidisciplinary teams within the school district, composed of individuals with expertise in brain injury, are ideal for implementing student-specific treatment plans given their specialized training and wide range of expertise in addressing student needs. Therefore, the purpose of this study is to develop and initially validate a quantitative instrument that school personnel can use to determine whether a student identified as having a traumatic brain injury will benefit from district-level consultation with a brain injury team. Three studies were designed to investigate the research questions. In study one, the planning and construction of the DORI-TBI were completed. Study two addressed the content validity of the DORI-TBI through a comparison analysis with other referral forms, content review with experts in the field of TBI, and cognitive interviews with professionals to test the usability of the new screening tool. In study three, a field administration was conducted using vignettes to measure construct validity. The results produced a valid and reliable new screening instrument that can help school-based teams make more efficient use of district-level consultation with a brain injury support team.
Abstract:
Author: Kerry W. Holton Title: SCHLEIERMACHER’S DOCTRINE OF BIBLICAL AUTHORITY: AN ALTERNATIVE TO CONTENT-BASED/SUPERNATURALIST AND FUNCTION-BASED/RATIONALIST MODELS Advisor: Theodore M. Vial, Jr. Degree Date: August 2015 This dissertation examines Friedrich Schleiermacher’s understanding of biblical authority and argues that, as an alternative to strictly supernaturalistic and rationalistic models, his understanding allows the New Testament to speak authoritatively in Christian religion in an age of critical, historical awareness. After classifying Schleiermacher’s position in a typology of the doctrine of biblical authority, this dissertation explores his conception of divine revelation and inspiration vis-à-vis scripture. It demonstrates that although he did not believe there is warrant for the claim of a direct connection between divine revelation and scripture, or that scripture is the foundation of faith, he nonetheless asserted that the New Testament is authoritative. He asserted the normative authority of the New Testament on the basis that it is the first presentation of Christian faith. This dissertation examines Schleiermacher’s “canon within the canon,” as well as his denial that the Old Testament shares the same normative worth and inspiration as the New. Although this dissertation finds difficulty with some of Schleiermacher’s views regarding the Old Testament, it names two significant strengths of what is identified as his evangelical, content-based, and rationalist approach to biblical authority. First, it recognizes and values the co-presence and co-activity of the supernatural and the natural in the production of the New Testament canon. This allows both scripture and the church to share religious authority. Second, it allows Christian faith and the historical method to coexist, as it does not require people to contradict what they know to be the case about science, history, and philosophy. Thus, this dissertation asserts that Schleiermacher’s understanding of biblical authority is a robust one, since, for him, the authority of scripture does not lie in some property of the texts themselves that historians or unbelievers can take away.
Abstract:
The idea of a conservation easement – restrictions on the development and use of land designed to protect the land’s conservation or historic values – can be relatively easily understood. More significant and more challenging is the complex body of state and federal laws that shapes the creation, funding, tax treatment, enforcement, modification, and termination of conservation easements. The explosion in the number of conservation easements over the past four decades has made them one of the most popular land protection mechanisms in the United States. The National Conservation Easement Database estimates that the total number of acres encumbered by conservation easements exceeds 40 million. Because conservation easements are both novel and ubiquitous, understanding how they actually work is essential for practicing lawyers, policymakers, land trust professionals, and students of conservation. This article provides a “quick tour” through some of the most important aspects of the developing mosaic of conservation easement law. It gives the reader a sense of the complex inter-jurisdictional dynamics that shape conservation transactions and disputes about conservation easements. Professors of property law, environmental law, tax law, and environmental studies who wish to cover conservation easements in the context of a more general course can use the article to provide their students with a broad but comprehensive overview of the relevant legal and policy issues.
Abstract:
The main objective of this paper is twofold: on the one hand, to analyse the impact that the announcement of the opening of a new hotel has on the performance of its chain by carrying out an event study; on the other hand, to compare the results of two different approaches to this method: a parametric specification based on autoregressive conditional heteroskedasticity models to estimate the market model, and a nonparametric approach employing Theil’s nonparametric regression technique, which leads to the so-called complete nonparametric approach to event studies. The results of the empirical application are noteworthy: on average, the reaction to such news releases is highly positive, with both approaches reaching the same level of significance. However, a word of caution is needed when one is interested not only in detecting whether the market reacts, but also in obtaining an exhaustive calculation of the abnormal returns in order to examine their determining factors.
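As a rough illustration of the contrast between the two approaches, the Python sketch below estimates a market model on simulated returns with ordinary least squares and with a Theil–Sen regression, and then computes event-window abnormal returns from each fit. The data are synthetic, and plain OLS stands in for the ARCH-based parametric specification used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated daily returns: a 120-day estimation window plus a 20-day event window.
market = rng.normal(0.0, 0.01, 140)
hotel = 0.0002 + 0.9 * market + rng.normal(0.0, 0.008, 140)
est_m, est_h = market[:120], hotel[:120]   # estimation window
evt_m, evt_h = market[120:], hotel[120:]   # event window

# Parametric market model (plain OLS here; an ARCH specification would also model the error variance).
beta, alpha = np.polyfit(est_m, est_h, 1)
ar_ols = evt_h - (alpha + beta * evt_m)

# Nonparametric alternative: Theil-Sen slope (median of pairwise slopes).
slope, intercept, _, _ = stats.theilslopes(est_h, est_m)
ar_theil = evt_h - (intercept + slope * evt_m)

print("CAR (OLS market model):", ar_ols.sum())
print("CAR (Theil regression):", ar_theil.sum())
```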
Abstract:
The objective of this paper is to develop a method to hide information inside a binary image. An algorithm to embed data in scanned text or figures is proposed, based on the detection of suitable pixels that satisfy certain conditions so that the embedding is not detectable. In broad terms, the algorithm locates pixels placed at the contours of the figures or in areas where some scattering of the two colors can be found. The hidden information is independent of the values of the pixels where it is embedded. Note that, depending on the sequence of bits to be hidden, around half of the pixels used to carry data bits will not be modified. The other basic characteristic of the proposed scheme is that the modified bits must be taken into account in order to perform the recovery process, which consists of reading the sequence of bits placed at the proper positions. An application to the banking sector is proposed for hiding information in signatures.
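As a much simplified illustration of the contour-based embedding idea (not the authors' exact algorithm), the Python sketch below marks the pixels of a binary image that have at least one 4-neighbour of the opposite colour and writes one message bit per such pixel; the bookkeeping needed for recovery is reduced here to sharing the positions with the receiver.

```python
import numpy as np

def contour_pixels(img):
    """Indices of pixels with at least one 4-neighbour of the opposite colour,
    i.e. pixels on the contours of a 0/1 image."""
    padded = np.pad(img, 1, mode="edge")
    up, down = padded[:-2, 1:-1], padded[2:, 1:-1]
    left, right = padded[1:-1, :-2], padded[1:-1, 2:]
    mask = (img != up) | (img != down) | (img != left) | (img != right)
    return np.argwhere(mask)

def embed(img, bits):
    """Write one bit per contour pixel; roughly half the pixels stay unchanged
    because they already equal the bit being stored."""
    stego = img.copy()
    positions = contour_pixels(img)[: len(bits)]
    for (r, c), bit in zip(positions, bits):
        stego[r, c] = bit
    return stego, positions

def extract(stego, positions):
    return [int(stego[r, c]) for r, c in positions]

img = (np.random.default_rng(1).random((32, 32)) > 0.5).astype(np.uint8)
stego, pos = embed(img, [1, 0, 1, 1, 0, 0, 1])
print(extract(stego, pos))  # recovers the embedded bit sequence
```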
Abstract:
The wide range of morphological variation in the “loxurina group” makes taxon identification difficult, and despite several reviews, serious taxonomic confusion remains. We use DNA data in conjunction with morphological appearance and available information on species distribution to delimit the boundaries of the “loxurina group” species previously established on morphological grounds. A fragment of 635 base pairs within the mtDNA gene cytochrome oxidase I (COI) was analysed for seven species of the “loxurina group”. Phylogenetic relationships among the included taxa were inferred using maximum parsimony and maximum likelihood methods. Penaincisalia sigsiga (Bálint et al.), P. cillutincarae (Draudt), P. atymna (Hewitson) and P. loxurina (C. Felder & R. Felder) were easily delimited, as the morphological, geographic and molecular data were congruent. Penaincisalia ludovica (Bálint & Wojtusiak) and P. loxurina astillero (Johnson) represent the same entity and constitute a subspecies of P. loxurina. However, incongruence among morphological, genetic, and geographic data is shown in P. chachapoya (Bálint & Wojtusiak) and P. tegulina (Bálint et al.). Our results highlight that an integrative approach is needed to clarify the taxonomy of these neotropical taxa, but more genetic and geographical studies are still required.
Abstract:
I propose a method to study interactional ironic humorous utterances in Spanish. The GRIALE research group considers that this method, grounded in the violation of conversational principles, can be applied to humorous ironic utterances in different textual genres. Furthermore, we present the General Theory of Verbal Humor proposed by Attardo, which will be drawn on in our analysis. I therefore study irony and humor in examples of conversations from corpora of real Peninsular Spanish samples (COVJA, Corpus de conversaciones coloquiales [Corpus of Colloquial Conversations] and CREA, Corpus de Referencia del Español Actual [Reference Corpus of Present-Day Spanish]). In this article, I focus on the application of this theory to humorous ironic statements that arise in conversation and examine the effects they cause, which will additionally verify whether irony and humor coexist in the same conversational exchange with a communicative aim and conversational strategies.
Abstract:
Different kinds of algorithms can be chosen to compute elementary functions. Among them, the shift-and-add algorithms are worth mentioning because they have been specifically designed to be very simple and to save computer resources. In fact, almost the only operations involved in these methods are additions and shifts, which can be easily and efficiently performed by a digital processor. Shift-and-add algorithms achieve fairly good precision with low-cost iterations. The most famous algorithm of this type is CORDIC, which can approximate a wide variety of functions with only a slight change in its iterations. In this paper, we will analyze the requirements of some engineering and industrial problems in terms of the type of operands and the functions to approximate. We will then propose the application of shift-and-add algorithms based on CORDIC to these problems, and compare the different methods in terms of the precision of the results and the number of iterations required.
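As an illustration of the shift-and-add principle, the Python sketch below implements the circular rotation mode of CORDIC for sine and cosine. It uses floating point for readability, whereas the multiplications by 2**-i would be plain shifts in the fixed-point hardware setting; it is a generic CORDIC example, not the specific configurations studied in the paper.

```python
import math

def cordic_sin_cos(theta, n_iter=32):
    """Approximate sin(theta) and cos(theta) for theta in [-pi/2, pi/2]
    using circular CORDIC in rotation mode."""
    # Precomputed rotation angles atan(2^-i) and the scale-compensation constant K.
    angles = [math.atan(2.0 ** -i) for i in range(n_iter)]
    K = 1.0
    for i in range(n_iter):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, theta
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0          # rotate towards z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y * K, x * K                       # (sin, cos)

# Quick check against the math library.
s, c = cordic_sin_cos(0.6)
print(abs(s - math.sin(0.6)), abs(c - math.cos(0.6)))
```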
Abstract:
Purpose – Deontical impure systems (DIS) are systems whose object set is an s-impure set, whose elements are perceptual significances (relative beings) of material and/or energetic objects (absolute beings), and whose relational set consists of freeways of relations, formed by sheaves of relations going in two-way directions, at least one of which has deontical properties such as permission, prohibition, obligation and faculty. The paper aims to discuss these issues. Design/methodology/approach – Mathematical and logical development of the ethical and normative structure of human society. Findings – The existence of relations with positive imperative modality (obligation) would constitute the skeleton of the system. Negative imperative modality (prohibition) would be the system's immunological protection. The modality of permission would be the muscular system, which gives the necessary flexibility. Four theorems have been formulated, based on Gödel's theorem, demonstrating the inconsistency or incompleteness of DIS: for each constructed systemic conception, one of two things happens: either some allowed responses are not produced, or some forbidden responses are produced. Responses prohibited by the system are defined as nonwished effects. Originality/value – This paper is a continuation of four previous papers and develops the theory of deontical impure systems.
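The dichotomy stated in the findings (allowed responses that are not produced versus forbidden responses that are produced) can be illustrated with a toy check in Python; the modality labels, norms and response sets below are hypothetical and are not the authors' formalism.

```python
# Toy illustration of the incompleteness/inconsistency dichotomy described above.
OBLIGATION, PROHIBITION, PERMISSION = "O", "F", "P"

norms = {
    "pay_taxes": OBLIGATION,
    "steal": PROHIBITION,
    "vote": PERMISSION,
}
produced_responses = {"pay_taxes", "steal"}   # what the system actually produces

unproduced_allowed = {r for r, m in norms.items()
                      if m in (OBLIGATION, PERMISSION) and r not in produced_responses}
produced_forbidden = {r for r, m in norms.items()
                      if m == PROHIBITION and r in produced_responses}

print("incomplete (allowed but not produced):", unproduced_allowed)
print("inconsistent (forbidden but produced):", produced_forbidden)
```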
Abstract:
Purpose – This paper takes a subjective approach to a type of complex system: human ecosystems, referred to as deontical impure systems (DIS), in order to capture a set of properties fundamental to the distinction between human and natural ecosystems. There are four main phenomenological components: directionality, intensity, connection energy and volume. The paper establishes a thermodynamics of deontical systems based on the Law of Zipf and the temperature of information. Design/methodology/approach – Mathematical and logical development of the structure of human society. Findings – A fundamental question in this approach to DIS is the intensity, or force, of a relation. Concepts such as system volume are introduced and a system thermodynamic theory is proposed. The paper hints at the possibility of adapting fractal theory by introducing the fractal dimension of the system. Originality/value – This paper is a continuation of previous papers and develops the theory of DIS.
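For the rank-frequency regularity that this thermodynamics builds on, a minimal Python sketch of estimating a Zipf exponent from hypothetical frequency data is shown below; it illustrates the Law of Zipf itself, not the paper's temperature-of-information construction.

```python
import numpy as np

# Rank-frequency data for a hypothetical system (e.g. how often each relation is used);
# Zipf's law predicts frequency ~ C / rank**s.
freqs = np.array([1200, 640, 410, 305, 250, 200, 170, 150, 132, 120])
ranks = np.arange(1, len(freqs) + 1)

# Estimate the exponent s by a least-squares fit in log-log space.
slope, log_c = np.polyfit(np.log(ranks), np.log(freqs), 1)
print("Zipf exponent:", -slope, "scale constant:", np.exp(log_c))
```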
Abstract:
In order to equip architecture students with an ambitious detail-designing ability, related courses in architecture programs should deal with subjects that are rare and unusual in real-life practice, so as to prevent students from copying standard details. In 2015, an innovative project brief was given to architecture students at Istanbul Technical University. The scenario given in the brief is to design a research station for the first group of Turkish scientists, to be built in one of the coldest and most arid regions on Earth: Antarctica. The performance requirements given in the brief were determined to prevent the students from copying details from any kind of resource, as the total number of details generated in real life for those conditions is very limited and specific. The method demonstrated great success, and creative detail solutions were generated by the students. In the paper, the innovative coursework brief for improving the detail-design ability of architecture students is explained and the output of the studio is presented.
Abstract:
At a global level, population growth and the expansion of the middle class lead to a growing demand for material resources. The built environment has an enormous impact on this scarcity. In addition, a surplus of construction and demolition waste is a common problem. The construction industry claims to recycle 95% of this waste, but this is in fact mainly downcycling. In the transition towards a circular economy, the quality of reuse becomes increasingly important. Buildings are material warehouses that can contribute to this high-quality reuse. However, several aspects of achieving this are unknown, and more insight into the potential for high-quality reuse of building materials is needed. Therefore, an instrument has been developed that determines the circularity of construction waste in order to maximise high-quality reuse. The instrument is based on three principles: ‘product and material flows in the end-of-life phase’, ‘future value of secondary materials and products’ and ‘the success of repetition in a new life cycle’. These principles are further divided into a number of criteria to which values and weighting factors are assigned. A degree of circularity can then be determined as a percentage. A case study of a typical 1970s building was carried out. For concrete, the circularity is increased from 25% to 50% by mapping out the potential for high-quality reuse. During the development of the instrument, it became clear that some criteria are difficult to measure. Accurate and reliable data are limited and assumptions had to be made. To increase the reliability of the instrument, experts have reviewed it several times. In the long term, the instrument can be used as a tool for quantitative research to reduce the amount of construction and demolition waste and contribute to reducing raw-material scarcity.
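A minimal Python sketch of how a weighted, criterion-based score of this kind can be turned into a circularity percentage is given below; the criteria names, values and weights are hypothetical and are not those of the instrument itself.

```python
# Hypothetical criteria, each with a score on a 0-1 scale and a weighting factor.
criteria = {
    "end_of_life_material_flows": (0.50, 0.40),
    "future_value_of_secondary_products": (0.40, 0.35),
    "repetition_in_new_life_cycle": (0.60, 0.25),
}

weighted_sum = sum(score * weight for score, weight in criteria.values())
total_weight = sum(weight for _, weight in criteria.values())
print(f"degree of circularity: {100 * weighted_sum / total_weight:.0f}%")
```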
Abstract:
Due to confidentiality considerations, the microdata available from the 2011 Spanish Census have been codified at the provincial (NUTS 3) level except when the municipal (LAU 2) population exceeds 20,000 inhabitants (a requirement met by less than 5% of all municipalities). For the remaining municipalities within a given province, information is only provided on their classification into wide population intervals. These limitations, which hamper territorially focused socio-economic analyses, and more specifically those related to the labour market, are observed in many other countries. This article proposes and demonstrates an automatic procedure for delineating a set of areas that meet such population requirements and that may be used to re-codify the geographic reference in these cases, thereby increasing the territorial detail at which individual information is available. The method aggregates municipalities into clusters based on the optimisation of a relevant objective function subject to a number of statistical constraints, and is implemented using evolutionary computation techniques. Clusters are defined to fit within outer boundaries at the level of labour market areas.
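A minimal Python sketch of the aggregation step is given below, with a greedy merge rule standing in for the evolutionary search and the objective function described in the article; the municipalities, populations, adjacency list and population threshold are all hypothetical.

```python
# Greedy illustration: merge contiguous municipalities into clusters until
# every cluster reaches the population threshold.
populations = {"A": 4000, "B": 9000, "C": 12000, "D": 3000, "E": 15000}
neighbours = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C", "E"}, "E": {"D"}}
THRESHOLD = 20000

clusters = [{m} for m in populations]  # start with one cluster per municipality

def cluster_pop(cl):
    return sum(populations[m] for m in cl)

def adjacent(a, b):
    return any(n in b for m in a for n in neighbours[m])

while any(cluster_pop(cl) < THRESHOLD for cl in clusters) and len(clusters) > 1:
    smallest = min(clusters, key=cluster_pop)
    candidates = [cl for cl in clusters if cl is not smallest and adjacent(smallest, cl)]
    partner = min(candidates, key=cluster_pop)   # merge with the smallest adjacent cluster
    clusters.remove(smallest)
    clusters.remove(partner)
    clusters.append(smallest | partner)

print([(sorted(cl), cluster_pop(cl)) for cl in clusters])
```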