857 results for Critical Approach


Relevance:

40.00%

Publisher:

Abstract:

This thesis examines ASEAN's prospects for establishing a regional competition policy in Southeast Asia, a topic of contemporary relevance in light of ASEAN's recent foray into economic integration on 31 December 2015. It asks whether the current approach undertaken by ASEAN can contribute to an effective regional competition policy under regional market integration. In answering this question, the thesis first critically surveys the current terrain of regional competition laws and policies in order to determine whether an optimal template exists. It argues that although the EU model is often used as a source of inspiration, each regional organisation configures the model differently in order to best adjust to its local regional context. The thesis inquires into the narratives of ASEAN's competition policy, as well as ASEAN's specific considerations in developing competition policy, before comparing the findings to the approaches ASEAN has actually taken in its pursuit of a regional competition policy. The thesis reveals that ASEAN's actual approach diverges significantly from its economic integration goal: ASEAN applies a soft harmonisation approach to substantive competition law while refraining from establishing a centralised or representative institution, the sole competition-policy organ at the regional level being an expert body. The thesis also investigates how the member states have received ASEAN's regional policy in order to ascertain whether ASEAN's aspiration of a regional competition policy can be achieved. The study reveals that despite some shared similarities in the broad principles of competition law amongst the member states, the various competition law regimes are not harmonised, creating a challenging obstacle to ASEAN's ambition. The thesis concludes that ASEAN's approach to regional competition law is unlikely to be effective.

Relevance:

40.00%

Publisher:

Abstract:

This chapter offers a framework for combining critical language policy with critical discourse studies (CDS) to analyse language policy as a process in the context of minority language policy in Wales. I propose a discursive approach to language policy, which starts from the premise that language policy is constituted, enacted, interpreted and (re)contextualised in and through language. This approach extends the critical language policy framework provided by Shohamy (Language policy: hidden agendas and new approaches. Routledge, London, 2006) and integrates perspectives from the context-sensitive discourse-historical approach in CDS. It incorporates discourse as an essential lens through which policy mechanisms, ideologies and practices are constituted and de facto language policy materialises. This chapter argues that conceptualising and analysing language policy as a discursive phenomenon enables a better understanding of the multi-layered nature of language policy that shapes the management and experience of corporate bilingualism in Wales.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to test whether the critical power model can be used to determine the critical rest interval (CRI) between vertical jumps. Ten males performed intermittent countermovement jumps on a force platform with different resting periods (4.1 +/- 0.3 s, 5.0 +/- 0.4 s, 5.9 +/- 0.6 s). Jump trials were interrupted when participants could no longer maintain 95% of their maximal jump height. After interruption, the number of jumps, total exercise duration and total external work were computed. Time to exhaustion (s) and total external work (J) were used to solve the equation Work = a + b · time. The CRI (corresponding to the shortest resting interval that allowed jump height to be maintained for a long time without fatigue) was determined by dividing the average external work needed to jump at a fixed height (J) by the b parameter (J/s). In the final session, participants jumped at their calculated CRI. A high coefficient of determination (0.995 +/- 0.007) and a CRI of 7.5 +/- 1.6 s were obtained. In addition, the longer the resting period, the greater the number of jumps (44 +/- 13, 71 +/- 28, 105 +/- 30, 169 +/- 53 jumps; p < 0.0001), time to exhaustion (179 +/- 50, 351 +/- 120, 610 +/- 141, 1,282 +/- 417 s; p < 0.0001) and total external work (28.0 +/- 8.3, 45.0 +/- 16.6, 67.6 +/- 17.8, 111.9 +/- 34.6 kJ; p < 0.0001). Therefore, the critical power model may be an alternative approach for determining the CRI during intermittent vertical jumps.
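The CRI computation described above amounts to fitting the linear work–time relation Work = a + b · time and dividing the per-jump work by the slope b. A minimal sketch in Python, using illustrative numbers rather than the study's raw data:

```python
def critical_rest_interval(times_s, works_j, work_per_jump_j):
    """Least-squares fit of Work = a + b * time; CRI = per-jump work / slope b (J/s)."""
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_w = sum(works_j) / n
    # Slope b of the work-time regression line
    b = sum((t - mean_t) * (w - mean_w) for t, w in zip(times_s, works_j)) / sum(
        (t - mean_t) ** 2 for t in times_s
    )
    return work_per_jump_j / b

# Illustrative values only (time to exhaustion in s, total work in J, work per jump in J)
cri = critical_rest_interval([179, 351, 610, 1282], [28000, 45000, 67600, 111900], 660)
```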

Relevance:

30.00%

Publisher:

Abstract:

The thermal performance of a cooling tower and its cooling water system is critical for industrial plants, and small deviations from the design conditions may cause severe instability in the operation and economics of the process. External disturbances such as variation in the thermal demand of the process or oscillations in atmospheric conditions may be suppressed in multiple ways. Nevertheless, such alternatives are hardly ever implemented in the industrial operation due to the poor coordination between the utility and process sectors. The complexity of the operation increases because of the strong interaction among the process variables. In the present work, an integrated model for the minimization of the operating costs of a cooling water system is developed. The system is composed of a cooling tower as well as a network of heat exchangers. After the model is verified, several cases are studied with the objective of determining the optimal operation. It is observed that the most important operational resources to mitigate disturbances in the thermal demand of the process are, in this order: the increase in recycle water flow rate, the increase in air flow rate and finally the forced removal of a portion of the water flow rate that enters the cooling tower with the corresponding make-up flow rate. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more being regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in such a context, given their importance in diagnosis, treatment, and research. Therefore, it is vital to take measures to prevent tampering and to determine their provenance. This demands the adoption of security mechanisms to assure information integrity and authenticity. A number of works in this field are based on two major approaches: use of metadata and use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method using cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity, without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
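The abstract does not detail the paper's exact cryptographic construction; as a generic illustration of binding integrity and authenticity information to image data (a hash for integrity, a keyed MAC for authenticity), using only the Python standard library:

```python
import hashlib
import hmac


def protect_image(pixel_data: bytes, key: bytes) -> dict:
    """Compute an integrity tag (SHA-256) and an authenticity tag (HMAC) for an image.
    The returned record would be stored alongside the image, e.g. in metadata."""
    return {
        "sha256": hashlib.sha256(pixel_data).hexdigest(),
        "hmac": hmac.new(key, pixel_data, hashlib.sha256).hexdigest(),
    }


def verify_image(pixel_data: bytes, key: bytes, record: dict) -> bool:
    """Recompute both tags and compare; any tampering changes the digests."""
    fresh = protect_image(pixel_data, key)
    return (
        hmac.compare_digest(fresh["hmac"], record["hmac"])
        and fresh["sha256"] == record["sha256"]
    )
```

Since the tags are stored separately from the pixel data, image quality is untouched, matching the abstract's stated goal.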

Relevance:

30.00%

Publisher:

Abstract:

This article intends to rationally reconstruct Locke's theory of knowledge as incorporated in a research program concerning the nature and structure of theories and models of rationality. In previous articles we argued that the rationalist program can be subdivided into the classical rationalistic subprogram, which includes the knowledge theories of Descartes, Locke, Hume and Kant; the neoclassical subprogram, which includes the approaches of Duhem, Poincaré and Mach; and the critical subprogram of Popper. The subdivision results from the different views of rationality proposed by each of these subprograms, as well as from the tools each makes available, containing theoretical instruments used to arrange, organize and develop the discussion on rationality, the main one being the structure of problem solving. In this essay we intend to reconstruct the assumptions of Locke's theory of knowledge, which in our view belongs to the classical rationalistic subprogram because it shares with it the thesis of the identity of (scientific) knowledge and certain knowledge.

Relevance:

30.00%

Publisher:

Abstract:

Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactic dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21, 4, and -20 degrees C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed in the first 24 h in samples maintained at -20 degrees C and after 2 days when maintained at 4 degrees C. Aside from glucose, all parameters were stable up to at least day 4 when stored at room temperature or at 4 degrees C. Conclusions: Temperature and storage time are potential sources of preanalytical error in pleural fluid analyses, mainly considering the instability of glucose and LDH. The ideal procedure is to execute all the tests immediately after collection. However, most of the tests can be performed on refrigerated samples, except for LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Rearrangements of 1p36 are the most frequently detected abnormalities in diagnostic testing for chromosomal cryptic imbalances and include variably sized simple terminal deletions, derivative chromosomes, interstitial deletions, and complex rearrangements. These rearrangements result in the specific pattern of malformation and neurodevelopmental disabilities that characterizes monosomy 1p36 syndrome. Thus far, no individual gene within this region has been conclusively determined to be causative of any component of the phenotype. Nor is it known whether the rearrangements convey phenotypes via a haploinsufficiency mechanism or through a position effect. We have used multiplex ligation-dependent probe amplification to screen for deletions of 1p36 in a group of 154 hyperphagic and overweight/obese, PWS-negative individuals, and in a separate group of 83 patients initially referred for investigation of a variety of other conditions. The strategy allowed the identification and delineation of rearrangements in nine subjects with a wide spectrum of clinical presentations. Our work reinforces the association of monosomy 1p36 with obesity and hyperphagia, and further suggests that these features may be associated with non-classical manifestations of this disorder in addition to a submicroscopic deletion of approximately 2-3 Mb in size. Multiplex ligation-dependent probe amplification using the monosomy 1p36 syndrome-specific kit coupled to the subtelomeric kit is an effective approach to identify and delineate rearrangements at 1p36. (C) 2009 Wiley-Liss, Inc.

Relevance:

30.00%

Publisher:

Abstract:

A neoadjuvant multimodality approach with chemoradiation therapy (CRT) is the preferred treatment strategy for most distal rectal cancers. Significant downstaging and complete pathologic response may develop after this strategy, and there is still controversy regarding the management of these patients. In this setting, a nonoperative approach has been suggested in select patients with complete clinical response after thorough clinical, endoscopic, and radiologic assessment. However, the assessment of these patients is not straightforward and remains complex. Available data regarding this approach are limited to a single institution's experience from retrospective analyses. Standardization of the assessment of tumor response and the development of radiological/molecular tools may clarify the role of no immediate surgery in patients with complete clinical response after neoadjuvant CRT. Advances in radiation and medical oncology could potentially lead to significant improvements in complete tumor regression rates, leading to an increase in importance of a minimally invasive approach in patients with rectal cancer. Semin Radiat Oncol 21:234-239 (C) 2011 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: The pterygopalatine fossa (PPF) is a narrow space located between the posterior wall of the antrum and the pterygoid plates. Surgical access to the PPF is difficult because of its protected position and its complex neurovascular anatomy. Endonasal approaches using rod-lens endoscopes, however, provide better visualization of this area and are associated with less morbidity than external approaches. Our aim was to develop a simple anatomical model using cadaveric specimens injected with intravascular colored silicone to demonstrate the endoscopic anatomy of the PPF. This model could be used for surgical instruction in the transpterygoid approach. Methods: We dissected six PPFs in three cadaveric specimens prepared with intravascular injection of colored material using two different injection techniques. An endoscopic endonasal approach, including a wide nasoantral window and removal of the posterior antrum wall, provided access to the PPF. Results: We produced our best anatomical model by injecting colored silicone via the common carotid artery. We found that, using an endoscopic approach, a retrograde dissection of the sphenopalatine artery helped to identify the internal maxillary artery (IMA) and its branches. Neural structures were identified deep to the vascular elements. Notable anatomical landmarks for the endoscopic surgeon are the vidian nerve and its canal, which leads to the petrous portion of the internal carotid artery (ICA), and the foramen rotundum and V2, which lead to Meckel's cave in the middle cranial fossa. These two nerves, the vidian and V2, are separated by a pyramid-shaped bone whose apex marks the ICA. Conclusion: Our anatomical model provides the means to learn the endoscopic anatomy of the PPF and may be used for the simulation of surgical techniques. An endoscopic endonasal approach provides adequate exposure of all anatomical structures within the PPF. These structures may be used as landmarks to identify and control deeper neurovascular structures. The significance is that an anatomical model facilitates learning the surgical anatomy and acquiring surgical skills. A dissection superficial to the vascular structures preserves the neural elements. These nerves and their bony foramina, such as those of the vidian nerve and V2, are critical anatomical landmarks for identifying and controlling the ICA at the skull base.

Relevance:

30.00%

Publisher:

Abstract:

Objective. The goal of this paper is to undertake a literature search collecting all dentin bond strength data obtained for six adhesives with four tests (shear, microshear, tensile and microtensile) and to critically analyze the results with respect to average bond strength, coefficient of variation, mode of failure and product ranking. Method. A PubMed search was carried out for the years 1998-2009, identifying publications on bond strength measurements of resin composite to dentin using four tests: shear, tensile, microshear and microtensile. The six adhesive resins were selected to cover three-step systems (OptiBond FL, Scotch Bond Multi-Purpose Plus), two-step systems (Prime & Bond NT, Single Bond, Clearfil SE Bond) and a one-step system (Adper Prompt L-Pop). Results. Pooling results from 147 references showed an ongoing high scatter in the bond strength data regardless of which adhesive and which bond test was used. Coefficients of variation remained high (20-50%) even with the microbond tests. The reported modes of failure for all tests still included a high number of cohesive failures. The ranking seemed to be dependent on the test used. Significance. The scatter in dentin bond strength data remains regardless of which test is used, confirming finite element analyses that predict non-uniform stress distributions due to a number of geometrical, loading, material property and specimen preparation variables. This reopens the question of whether an interfacial fracture mechanics approach to analyzing the dentin-adhesive bond would be more appropriate for obtaining better agreement among dentin bond related papers. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a model for the dynamics of a patchy population in a stochastic environment and derive a criterion for its persistence. This criterion is based on the geometric mean (GM) through time of the spatial-arithmetic mean of growth rates. For the population to persist, the GM has to be greater than or equal to 1. The GM increases with the number of patches (because the sampling error is reduced) and decreases with both the variance and the spatial covariance of growth rates. We derive analytical expressions for the minimum number of patches (and the maximum harvesting rate) required for the persistence of the population. As the magnitude of environmental fluctuations increases, the number of patches required for persistence increases, and the fraction of individuals that can be harvested decreases. The novelty of our approach is that we focus on Malthusian local population dynamics with high dispersal and strong environmental variability from year to year. Unlike previous models of patchy populations that assume an infinite number of patches, we focus specifically on the effect that the number of patches has on population persistence. Our work is therefore directly relevant to patchily distributed organisms that are restricted to a small number of habitat patches.
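The persistence criterion can be illustrated numerically: average growth rates across patches within each year (arithmetic mean), then take the geometric mean of those yearly averages over time; the population persists when this GM is at least 1. A minimal sketch with made-up growth rates, not the paper's analytical model:

```python
import math


def geometric_mean_of_spatial_means(growth_rates):
    """growth_rates: list of years, each a list of per-patch growth rates (> 0).
    Returns the GM over time of the spatial-arithmetic mean."""
    yearly_means = [sum(year) / len(year) for year in growth_rates]
    # Geometric mean via the mean of logs, to avoid overflow on long series
    log_mean = sum(math.log(m) for m in yearly_means) / len(yearly_means)
    return math.exp(log_mean)


def persists(growth_rates):
    """Persistence criterion: GM of spatial means >= 1."""
    return geometric_mean_of_spatial_means(growth_rates) >= 1.0
```

Because averaging across patches happens before the geometric mean is taken, adding patches damps yearly fluctuations and raises the GM, which is the sampling-error effect the abstract describes.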

Relevance:

30.00%

Publisher:

Abstract:

As marketers and researchers we understand quality from the consumer's perspective, and throughout contemporary service quality literature there is an emphasis on what the consumer is looking for, or at least that is the intention. Through examining the underlying assumptions of dominant service quality theories, an implicit dualistic ontology is highlighted (where subject and object are considered independent) and argued to effectively negate the said necessary consumer orientation. This fundamental assumption is discussed, as are the implications, following a critical review of dominant service quality models. Consequently, we propose an alternative approach to service quality research that aims towards a more genuine understanding of the consumer's perspective on quality experienced within a service context. Essentially, contemporary service quality research is suggested to be limited in its inherent third-person perspective and the interpretive, specifically phenomenographic, approach put forward here is suggested as a means of achieving a first-person perspective on service quality.

Relevance:

30.00%

Publisher:

Abstract:

The Load-Unload Response Ratio (LURR) method is an intermediate-term earthquake prediction approach that has shown considerable promise. It involves calculating the ratio of a specified energy-release measure during loading to that during unloading, where loading and unloading periods are determined from the earth-tide-induced perturbations in the Coulomb failure stress on optimally oriented faults. In the lead-up to large earthquakes, high LURR values are frequently observed a few months or years prior to the event. These signals may have a similar origin to the accelerating seismic moment release (AMR) observed prior to many large earthquakes, or may be due to critical sensitivity of the crust when a large earthquake is imminent. As a first step towards studying the underlying physical mechanism for the LURR observations, numerical studies are conducted using the particle-based lattice solid model (LSM) to determine whether LURR observations can be reproduced. The model is initialized as a heterogeneous 2-D block made up of random-sized particles bonded by elastic-brittle links. The system is subjected to uniaxial compression from rigid driving plates on the upper and lower edges of the model. Experiments are conducted using both strain and stress control to load the plates. A sinusoidal stress perturbation is added to the gradual compressional loading to simulate loading and unloading cycles, and LURR is calculated. The results reproduce signals similar to those observed in earthquake prediction practice, with a high LURR value followed by a sudden drop prior to macroscopic failure of the sample. The results suggest that LURR provides a good predictor of catastrophic failure in elastic-brittle systems and motivate further research into the underlying physical mechanisms and statistical properties of high LURR values. The results provide encouragement for earthquake prediction research and for the use of advanced simulation models to probe the physics of earthquakes.
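The core LURR statistic is the ratio of an energy-release measure summed over loading periods to the same measure summed over unloading periods; values well above 1 reflect the asymmetry seen before large events. A minimal sketch, assuming events have already been tagged as loading or unloading from the tidal Coulomb-stress calculation (illustrative only, not the LSM implementation):

```python
def lurr(events):
    """events: iterable of (energy_release, is_loading) pairs.
    Returns sum of releases during loading / sum during unloading."""
    events = list(events)  # allow a generator to be passed in
    loading = sum(e for e, is_load in events if is_load)
    unloading = sum(e for e, is_load in events if not is_load)
    if unloading == 0:
        raise ValueError("no energy release recorded during unloading periods")
    return loading / unloading
```

A value near 1 indicates symmetric response to loading and unloading (stable crust); in the studies described above, the ratio rises well above 1 before macroscopic failure.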

Relevance:

30.00%

Publisher:

Abstract:

For zygosity diagnosis in the absence of genotypic data, or in the recruitment phase of a twin study where only single twins from same-sex pairs are being screened, or to provide a test for sample duplication leading to the false identification of a dizygotic pair as monozygotic, the appropriate analysis of respondents' answers to questions about zygosity is critical. Using data from a young adult Australian twin cohort (N = 2094 complete pairs and 519 singleton twins from same-sex pairs with complete responses to all zygosity items), we show that application of latent class analysis (LCA), fitting a 2-class model, yields results that show good concordance with traditional methods of zygosity diagnosis, but with certain important advantages. These include the ability, in many cases, to assign zygosity with specified probability on the basis of responses of a single informant (advantageous when one zygosity type is being oversampled); and the ability to quantify the probability of misassignment of zygosity, allowing prioritization of cases for genotyping as well as identification of cases of probable laboratory error. Out of 242 twins (from 121 like-sex pairs) where genotypic data were available for zygosity confirmation, only a single case was identified of incorrect zygosity assignment by the latent class algorithm. Zygosity assignment for that single case was identified by the LCA as uncertain (probability of being a monozygotic twin only 76%), and the co-twin's responses clearly identified the pair as dizygotic (probability of being dizygotic 100%). In the absence of genotypic data, or as a safeguard against sample duplication, application of LCA for zygosity assignment or confirmation is strongly recommended.
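A two-class latent class model for binary zygosity items can be fitted with a simple EM algorithm: the E-step computes each respondent's posterior class probability from the current parameters, and the M-step re-estimates the class prior and the per-class item endorsement probabilities from those posteriors. A minimal sketch under those assumptions (not the authors' software), using only the standard library:

```python
import random


def fit_lca_2class(data, n_iter=200, seed=0):
    """EM for a 2-class latent class model with binary (0/1) items.
    data: list of equal-length response vectors.
    Returns (class-0 prior, per-class item probabilities, per-respondent P(class 0))."""
    rng = random.Random(seed)
    n, k = len(data), len(data[0])
    pi = 0.5  # prior probability of class 0
    # Per-class item endorsement probabilities, randomly initialised to break symmetry
    p = [[rng.uniform(0.3, 0.7) for _ in range(k)] for _ in range(2)]
    resp = []
    for _ in range(n_iter):
        # E-step: posterior probability of class 0 for each respondent
        resp = []
        for x in data:
            like = []
            for c in range(2):
                l = pi if c == 0 else 1 - pi
                for j in range(k):
                    l *= p[c][j] if x[j] else 1 - p[c][j]
                like.append(l)
            resp.append(like[0] / (like[0] + like[1]))
        # M-step: re-estimate prior and item probabilities from posteriors
        total0 = sum(resp)
        pi = total0 / n
        for j in range(k):
            p[0][j] = sum(r * x[j] for r, x in zip(resp, data)) / max(total0, 1e-9)
            p[1][j] = sum((1 - r) * x[j] for r, x in zip(resp, data)) / max(n - total0, 1e-9)
    return pi, p, resp
```

The per-respondent posteriors are what make the approach useful here: a twin whose posterior is close to 0.5 (like the 76% case above) can be flagged for genotyping, while confident assignments need no further testing.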