955 results for arguments by definition
Abstract:
I conducted this study to deepen understanding of the association between culture and writing by building, assessing, and refining a conceptual model of second language writing. To do this, I examined culture and coherence, and the relationship between them, through a mixed methods research design. Coherence is an important and complex concept in ESL/EFL writing, and I studied it in the research context of contrastive rhetoric, comparing the coherence quality of argumentative essays written by undergraduates in Mainland China and by their U.S. peers. To analyze this complex concept, I synthesized five linguistic theories of coherence: Halliday and Hasan's cohesion theory, Carroll's theory of coherence, Enkvist's theory of coherence, Topical Structure Analysis, and Toulmin's Model. From this synthesis, 16 variables were generated, and a Hotelling's T² test was conducted across them to test for differences in argumentative coherence between the essays written by the two groups of participants. To complement the statistical analysis, I conducted 30 interviews with the writers in the study; participants' responses were analyzed with open and axial coding. By analyzing the empirical data, I refined the conceptual model, adding categories and establishing associations among them. The study found that U.S. students made more use of pronominal reference, while Chinese students adopted more lexical devices of reiteration and more extended parallel progression. The interview data implied that this difference may be associated with differences in the linguistic features and rhetorical conventions of Chinese and English. As far as Toulmin's Model is concerned, Chinese students scored higher on data than their U.S. peers. According to the interview data, this may be because Toulmin's Model, modified as the three elements of argument, has long been widely taught in Chinese writing instruction, whereas U.S. interview participants said they were not taught to write essays according to Toulmin's Model. Implications were drawn from the textual data analysis and the formulation of a structural model defining coherence, aimed at informing writing instruction, assessment, peer review, and self-revision.
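For readers unfamiliar with the statistic named above: Hotelling's T² is the multivariate generalisation of the two-sample t-test, comparing two groups on all 16 coherence variables jointly. A minimal sketch in Python follows; the data layout and group sizes are illustrative assumptions, not the study's materials.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2(X, Y):
    """Two-sample Hotelling's T^2 test: do two groups differ jointly on p variables?

    X: (n1, p) array, Y: (n2, p) array -- one row per essay, one column per
    coherence variable (p = 16 in the study described above).
    Returns the T^2 statistic, the equivalent F statistic, and the p-value.
    """
    n1, p = X.shape
    n2 = Y.shape[0]
    diff = X.mean(axis=0) - Y.mean(axis=0)
    # Pooled within-group covariance matrix.
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S, diff)
    # T^2 converts to an F statistic with (p, n1 + n2 - p - 1) degrees of freedom.
    f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_value = f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, f_stat, p_value

# Illustrative use: 40 essays per group, 16 variables of synthetic data.
rng = np.random.default_rng(1)
t2, f_stat, p_value = hotelling_t2(rng.normal(size=(40, 16)),
                                   rng.normal(size=(40, 16)))
```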
Abstract:
Many culturally and linguistically diverse (CLD) students with specific learning disabilities (SLD) struggle with the writing process. In particular, they have difficulty developing and expanding ideas, organizing and elaborating sentences, and revising and editing their compositions (Graham, Harris, & Larsen, 2001; Myles, 2002). Computer graphic organizers offer a possible means of assisting them in their writing. This study investigated the effects of a computer graphic organizer on the persuasive writing compositions of Hispanic middle school students with SLD. A multiple baseline design across subjects was used to examine its effects on six dependent variables: number of arguments and supporting details, number and percentage of transferred arguments and supporting details, planning time, writing fluency, syntactic maturity (measured in T-units, the shortest grammatical sentence without fragments), and overall organization. Data were collected and analyzed throughout baseline and intervention. Participants were taught persuasive writing and the writing process prior to baseline. During baseline, participants were given a prompt and asked to plan their compositions with paper and pencil; a computer was used for typing and editing. During intervention, participants used a computer graphic organizer for planning and then a computer for typing and editing. The planning sheets and written compositions were printed and analyzed daily, along with the time each participant spent planning. The use of computer graphic organizers had a positive effect on planning and on the persuasive writing compositions: increases were noted in the number of supporting details planned, the percentage of supporting details transferred, planning time, writing fluency, syntactic maturity in number of T-units, and the overall organization of the compositions. Minimal to negligible increases were noted in the mean number of arguments planned and written, effects on the percentage of transferred arguments varied, and mean T-unit length decreased. This study extends the limited literature on computer graphic organizers as a prewriting strategy for Hispanic students with SLD. To fully gauge the potential of this intervention, future research should investigate different features of computer graphic organizer programs and their effects with other writing genres and other populations.
Abstract:
This thesis presents a certification method for semantic web service compositions that aims to statically ensure their functional correctness. The method encompasses two dimensions of verification, termed the base and functional dimensions. The base dimension concerns the correctness of each semantic web service application in the composition, i.e., it ensures that every service invocation in the composition complies with the respective service definition. Certification of this dimension exploits the semantic compatibility between invocation arguments and the formal parameters of the semantic web service. The functional dimension aims to ensure that the composition satisfies a given specification expressed as preconditions and postconditions. This dimension is formalized by a calculus based on Hoare logic, from whose deductive system partial correctness specifications involving compositions of semantic web services can be derived. Our work is also characterized by the use of a fragment of description logic, namely ALC, to express the partial correctness specifications. To operationalize the proposed certification method, we developed a supporting environment for defining semantic web service compositions and conducting the certification process. The method was experimentally evaluated by applying it to three proofs of concept, which enabled a broad evaluation of the certification method.
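As a rough illustration of the base dimension described above (checking that each invocation argument is semantically compatible with the corresponding formal parameter), here is a minimal subsumption check over a toy concept hierarchy. The hierarchy, service signature and function names are our assumptions, not the thesis's ALC machinery or supporting environment.

```python
# Minimal sketch of the "base dimension" check: each invocation argument's
# concept must be subsumed by the concept of the corresponding formal
# parameter. The hierarchy and signatures below are illustrative toys.
SUBCLASS_OF = {            # child -> parent in a toy concept hierarchy
    "CreditCard": "PaymentMethod",
    "Visa": "CreditCard",
    "Hotel": "Accommodation",
}

def subsumed_by(concept: str, ancestor: str) -> bool:
    """True if `concept` is `ancestor` or a descendant of it."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

def check_invocation(formal_params: list[str], args: list[str]) -> bool:
    """Base-dimension check: every argument concept fits its parameter."""
    return (len(formal_params) == len(args) and
            all(subsumed_by(a, p) for p, a in zip(formal_params, args)))

# A hypothetical booking service declared over (Accommodation, PaymentMethod):
print(check_invocation(["Accommodation", "PaymentMethod"], ["Hotel", "Visa"]))  # True
print(check_invocation(["Accommodation", "PaymentMethod"], ["Visa", "Hotel"]))  # False
```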
Abstract:
The turn to neoliberalism in the 1990s proved decisive for Mexico, as the NAFTA project embraced by the Salinas administration entailed a redefinition of national identity, defined since the revolution as mestizo, Catholic and, especially, as the Other to the United States. Just as cinema had been a crucial discourse for that earlier construction of identity, in the 1990s it was equally instrumental to its redefinition, which largely focused on the role of migrants to the US, presented even as supplementary in the Derridean sense. In 1992, as part of these efforts, Sergio Arau directed a mockumentary which in 2004 became a feature film, ‘A Day Without a Mexican’. In a manner befitting the seriousness of documentary more than the excess and parody of mockumentary, the stated aim of both was to advance a social agenda, arguing the case for immigrant labour and for Mexican presence in the US more generally. The film charts what would happen in California were all Latino immigrants to suddenly disappear, arguing that chaos would ensue. Given the link between cinema and modernity and the relevance of cinema for the nation as an alternative public sphere, this chapter looks at the implications of choosing mockumentary, taken by many to be a paradigmatic postmodern and hybrid form, to discuss the hybridisation of national identity in a transnational film in the present age of globalisation.
Abstract:
Traditionally, attitude importance has been measured using subjective self-report measures. The present thesis explores the possibility of a second type of importance, designated “associative importance”. A new measure, the IIAT, was designed to capture the strength of association between an object and the attribute of importance. The thesis then evaluated the validity of the IIAT via an intervention paradigm in two studies, and by using the measure to predict a memory outcome in two other studies. Subjective measures of importance were also included, and correlations between the subjective measures and IIAT results were examined. Across all four studies, subjective-objective correlations were weak to modest and non-significant. The intervention studies provided promising evidence that interventions do affect associative importance as measured by the IIAT. The prediction studies provided somewhat mixed but encouraging evidence that the IIAT may be able to predict memory performance; notably, subjective measures were not able to predict memory performance at all, whereas the IIAT predicted some memory indices. Overall, there is some evidence supporting the existence of an associative importance construct and indicating that the IIAT provides valid results that are nonetheless different from those of subjective measures of attitude importance.
Abstract:
The definition of the boundaries of the firm is a subject that has long occupied organizational theorists, with the seminal work of Coase (1937) cited as the trigger for a theoretical evolution, with emphasis on governance structures, that led to a modern theory of incomplete contracts. Transaction Cost Economics (TCE) and Agency Theory arose within this evolution and are widely used in studies on the theme. Empirically, data envelopment analysis (DEA) has established itself as a suitable tool for efficiency analysis. Although TCE argues that specific assets must be internalized, recent studies outside the theory's mainstream show that firms often decide, for various reasons, to contract them on the market. Research on transaction costs faces the unavailability of information and methodological difficulties in measuring its critical variables, and further methodological deepening is still needed. The theoretical framework includes classic works of TCE and Agency Theory, but also more recent works outside the TCE mainstream, which warn of strategies in the use of specific assets that are not necessarily aligned with the classical ideas of TCE. The Brazilian oil industry is the focus of this thesis, which aimed to evaluate the efficiency of contracts involving highly specific services outsourced by Petrobras. To this end, we categorized the outsourced services in terms of specificity and described the services with the highest specificity. We then verified the existence of relationships between service specificity and a number of variables, finding results that diverge from those preached by the TCE mainstream. Next, we designed a DEA model to analyze efficiency in the use of onshore drilling rigs, identified as among the services of highest specificity, and applied the model to evaluate the performance of drilling-rig contracts. Finally, we verified the existence of relationships between contract efficiency and a number of variables, again finding results not consistent with the theory's mainstream. Regarding the efficiency analysis of drilling-rig contracts, the model developed is compatible with the academic literature on drilling-rig efficiency. The efficiency results show a wide range of scores, from 31.79% to 100%, with a low sample average. The model's results are consonant with the practices adopted by Petrobras and strengthen DEA as an important tool in efficiency studies, with the possibility of use in analyzing other types of contracts. In terms of theoretical findings, the results reinforce the argument that there are situations in which organizations' strategies for the use of highly specific assets and services do not necessarily follow what is recommended by the TCE mainstream.
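The abstract does not state which DEA formulation the thesis used, so the sketch below shows only the generic starting point for such efficiency scores: the input-oriented CCR (constant returns to scale) envelopment model, solved as one linear program per decision making unit. The toy inputs and outputs are illustrative assumptions, not the thesis's variables.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (constant returns to scale).

    X: (m, n) inputs, Y: (s, n) outputs, one column per DMU.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, k],  Y @ lam >= Y[:, k].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimise theta
    A_in = np.hstack([-X[:, [k]], X])             # X lam - theta x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])     # -Y lam <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, k]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy data: 2 inputs (e.g. rig day-rate, crew size), 1 output (metres drilled).
X = np.array([[4.0, 2.0, 5.0, 3.0],
              [3.0, 3.0, 4.0, 2.0]])
Y = np.array([[8.0, 6.0, 9.0, 5.0]])
scores = [ccr_efficiency(X, Y, k) for k in range(X.shape[1])]
```

In this formulation a score of 1 marks a DMU on the efficiency frontier; the 31.79%-100% range reported above corresponds to scores between 0.3179 and 1.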
Abstract:
The task-based approach involves identifying all the tasks performed in each workplace in order to refine exposure characterization. Its starting point is the recognition that only through a more detailed and comprehensive understanding of tasks is it possible to understand the exposure scenario in greater detail; it also allows identification of the most suitable risk management measures. The approach can likewise be used when there is a need to identify workplace surfaces for sampling chemicals for which the dermal route is the most important exposure route: through detailed observation of task performance, it is possible to identify the surfaces that workers contact most frequently and that may be contaminated. Objective: to identify the surfaces to sample when performing occupational exposure assessment of antineoplastic agents, with surface selection based on the task-based approach.
Abstract:
The ligaments of the wrist are highly variable and poorly described, most obviously on the ulnar side of the wrist. Previous studies highlighted potential differences within the ligaments of the wrist, but no consensus has been reached: poor tissue description and inconsistent use of terminology have hindered the reproducibility of results. An improved understanding of the morphological variations between carpal bones may in turn improve understanding of the ligamentous structure of the wrist. This study aims to identify variations between carpal bones that could be used to separate the palmar ligamentous patterns around the triquetrum-hamate joint into subgroups within the sample population. Investigations were performed following a detailed nomenclature and a clear definition of ligamentous structures, to facilitate detailed description and reproducible results. Quantitative analyses were conducted using a 3D modelling technique, and histological sections were then analysed to identify the structure of each ligamentous attachment. Variable patterns of ligamentous attachment were identified; differences were obvious not only between samples but also between the right and left hands of the same person. These findings suggest that the palmar ligamentous patterns around the triquetrum-hamate joint are best described as a spectrum, with a higher affinity of the triquetrum-hamate-capitate ligament and the lunate-triquetrum ligament for type I lunate wrists at one extreme, and type II lunate wrists with palmar triquetrum-hamate ligament, triquetrum-hamate-capitate ligament and palmar radius-lunate-triquetrum ligament attachments at the other. Histological analyses confirmed previously established work on the mechanical role of ligaments in wrist joint biomechanics, and there were no significant differences between the quantitative data obtained from Genelyn-embalmed and unembalmed specimens (p>0.05). The study demonstrated variable ligamentous patterns that suggest different bone restraints and two different patterns of motion, supporting previous suggestions that the midcarpal joint be separated into two distinct functional types: type I wrists showed ligamentous attachments suggestive of a rotating/translating hamate, whilst type II wrists showed attachments suggestive of a flexing/extending hamate, based upon the patterns of ligamentous attachment in relation to the morphological features of the underlying lunate. These findings open the way for particular consideration and/or modification of surgical procedures, which may enhance the clinical management of wrist dysfunction.
Abstract:
This essay presents a series of discussions revolving around librarianship as a discipline and the professionals trained in it. It covers the concept of the library from the formal point of view contributed by various authors and dictionaries, but also from the particular vision of the author, who conceptualizes it from the standpoint of practice, academia, research, and daily contact with users. It is thus a short journey from tradition to the current reality shaped by the inclusion of information and communication technologies, but above all a journey of a very personal character.
Abstract:
Since rugby union turned professional in 1995 there have been considerable advances in research on the demands of the game, largely using Global Positioning System (GPS) analysis over the last 10 years. A systematic review was undertaken on the use of GPS, particularly the setting of absolute (ABS) and individual (IND) velocity bands, in field-based, intermittent, high-intensity (HI) team sports. From 3669 records identified, 38 studies were included for qualitative analysis. Little agreement was found on the definition of movement intensities within team sports; only three papers, all on rugby union, had used IND bands, and only one compared ABS and IND methods. The aim of this study was therefore to determine whether the demands within positions differ when comparing ABS and IND methods of GPS analysis, and whether those differences are significantly different between the forward and back positional groups. A total of 214 data files were recorded from 26 players in 17 matches of the 2015/2016 Scottish BT Premiership. ABS velocity zones 1-7 were set at 1) 0-6, 2) 6.1-11, 3) 11.1-15, 4) 15.1-18, 5) 18.1-21, 6) 21.1-25 and 7) 25.1-40 km.h-1, while IND zones 1-7 were 1) <20, 2) 20-40, 3) 40-50, 4) 50-70, 5) 70-80, 6) 80-95 and 7) 95-100% of each player's individually determined maximum velocity (Vmax). A 40 m sprint test measured Vmax, using OptaPro S4 10 Hz GPS units (Catapult, Australia) to derive the IND bands; the same GPS units were worn during matches. The GPS outputs analysed were % distance, % time, high-intensity efforts (HIEs) above 18.1 km.h-1 / 70% Vmax, and repeated high-intensity efforts (RHIEs), defined as three HIEs within 21 seconds. General linear model (GLM) analysis identified a significant difference between the ABS and IND methods in the measurement of % total distance covered in all zones for forwards (p<0.05) and backs (p<0.05). This difference was also significant between forwards and backs in zones 1 (mean difference ± standard deviation: 3.7±0.7%), 6 (1.2±0.4%) and 7 (1.0±0.0%) (p<0.05). Percentage time estimations differed significantly between ABS and IND analysis within forwards in zones 1 (1.7±1.7%), 2 (-2.9±1.3%), 3 (1.9±0.8%), 4 (-1.4±0.8%) and 5 (0.2±0.4%), and within backs in zones 1 (-10±1.5%), 2 (-1.2±1.1%), 3 (1.8±0.9%) and 5 (0.6±0.5%) (p<0.05); the difference between groups was significant in zones 1, 2, 4 and 5 (p<0.05). The number of HIEs differed significantly between forwards and backs in zones 6 (6±2) and 7 (3±2). RHIEs differed significantly between ABS and IND for forwards (1±2, p<0.05), although not between groups. Until more research on the differences between ABS and IND methods is carried out, neither can be deemed a criterion method. In conclusion, there are significant differences between the ABS and IND methods of GPS analysis of the physical demands of rugby union, which must be considered when the data are used to inform training load and recovery, to improve performance and reduce injuries.
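To make the two zoning schemes concrete, here is a minimal sketch of how a 10 Hz velocity trace can be binned into the seven ABS and IND zones quoted above and how RHIEs can be counted. The helper names, boundary handling and synthetic trace are our assumptions, not the study's processing pipeline.

```python
import numpy as np

# Zone edges come from the abstract; zone 1 starts at 0 in both schemes.
ABS_EDGES = [0, 6, 11, 15, 18, 21, 25, 40]        # km.h-1, zones 1-7
IND_EDGES = [0, 20, 40, 50, 70, 80, 95, 100]      # % of Vmax, zones 1-7

def zone_ids(values, edges):
    """Map each sample to a zone 1-7 (boundary inclusion is approximate)."""
    return np.digitize(values, edges[1:-1], right=True) + 1

def pct_time_per_zone(zones):
    """Share of samples falling in each of the seven zones, in percent."""
    return np.array([(zones == z).mean() * 100 for z in range(1, 8)])

def count_rhie(hie_start_times, window=21.0, n=3):
    """Repeated high-intensity efforts: n HIE onsets within `window` seconds."""
    t = np.sort(np.asarray(hie_start_times))
    return int(np.sum(t[n - 1:] - t[: len(t) - n + 1] <= window))

# Illustrative 10-minute trace for a player with an assumed Vmax of 32 km.h-1:
rng = np.random.default_rng(0)
speed = np.abs(rng.normal(8, 6, 6000))            # fake 10 Hz velocities, km.h-1
abs_pct = pct_time_per_zone(zone_ids(speed, ABS_EDGES))
ind_pct = pct_time_per_zone(zone_ids(100 * speed / 32.0, IND_EDGES))
```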
Abstract:
Across the international educational landscape, numerous higher education institutions (HEIs) offer postgraduate programmes in occupational health psychology (OHP). These seek to equip the next generation of OHP practitioners with the knowledge and skills necessary to advance the understanding and prevention of workplace illness and injury, improve working life and promote healthy work through the application of psychological principles and practices. Among the OHP curricula operated within these programmes there exists considerable variability in the topics addressed. This is due, inter alia, to the youthfulness of the discipline and the fact that the development of educational provision has been managed at the level of the HEI, where it has remained undirected by external forces such as the discipline's representative bodies. Such variability makes it difficult to discern the key characteristics of a curriculum, which is important for programme accreditation purposes, for the professional development and regulation of practitioners and, ultimately, for the long-term sustainability of the discipline. This chapter focuses on the imperative for, and development of, consensus surrounding OHP curriculum areas. It begins by examining the factors currently driving curriculum developments and explores some of the barriers to them. It then reviews the limited body of previous research that has attempted to discern key OHP curriculum areas. This provides a foundation for describing a study by the current authors that elicited subject matter expert opinion from an international sample of academics involved in OHP-related teaching and research on which topic areas might be considered important for inclusion within an OHP curriculum. The chapter closes by drawing conclusions on steps that the discipline's representative bodies could take towards the consolidation and accreditation of a core curriculum.
Abstract:
Many production systems undergo acquisition and merger operations to increase productivity. This paper proposes a novel method, using the InvDEA model, to anticipate whether a merger in a market would generate a major or a minor consolidation. A merger between two or more decision making units (DMUs) that produces a single merged DMU affecting the efficiency frontier, as defined by the pre-consolidation market conditions, is called a major consolidation; the alternative case is called a minor consolidation. A necessary and sufficient condition for distinguishing the two types of consolidation is proven, and two numerical illustrations in banking and supply chain management are discussed. The crucial importance of anticipating the magnitude of a consolidation in a market is outlined.
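The paper's necessary and sufficient condition is not reproduced in the abstract, so the sketch below substitutes a naive reading of the definition rather than the InvDEA formulation itself: pool the merging DMUs' inputs and outputs, score the merged unit against the pre-merger frontier with a standard CRS envelopment program, and flag the consolidation as major when the score exceeds 1 (the pre-merger technology cannot match the merged unit, so the frontier moves). The pooling assumption and all names are ours.

```python
import numpy as np
from scipy.optimize import linprog

def frontier_score(X, Y, x0, y0):
    """min theta s.t. X lam <= theta * x0, Y lam >= y0, lam >= 0 (CRS).

    X, Y hold the pre-merger DMUs column-wise; (x0, y0) is the merged DMU.
    A score above 1 means the merged unit lies beyond the old frontier.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.vstack([np.hstack([-x0.reshape(-1, 1), X]),
                      np.hstack([np.zeros((s, 1)), -Y])])
    b_ub = np.r_[np.zeros(m), -y0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0] if res.success else np.inf

# Merge DMUs a and b by simply pooling inputs and outputs (one merger model):
X = np.array([[4.0, 2.0, 5.0], [3.0, 3.0, 4.0]])
Y = np.array([[8.0, 6.0, 9.0]])
a, b = 0, 1
keep = [j for j in range(X.shape[1]) if j not in (a, b)]
x0, y0 = X[:, a] + X[:, b], Y[:, a] + Y[:, b]
major = frontier_score(X[:, keep], Y[:, keep], x0, y0) > 1.0   # True here
```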
Abstract:
This article explores the struggle for legitimation associated with attempts to define the risk of Bt cotton, a genetically modified crop, in Andhra Pradesh, India. Beck asserts that, given the uncertainty associated with risk society, efforts to define risk are creating the need for a new political culture. This article argues that this political culture emerges from attempts to legitimate power within risk definition. This is examined by applying critical discourse analysis to excerpts from interviews with key figures in the Bt cotton debate. Legitimation is explored using the categories of legitimation developed by Van Leeuwen: (a) authorisation; (b) moral evaluation; (c) rationalisation; and (d) mythopoesis. The analysis highlights that the political culture which emerges in response to risk society is in a state of constant flux, contingent upon the ongoing struggle for legitimation over the definition of risk.