879 results for: whether court should, or could, make orders about basis of assessment


Relevance:

100.00%

Publisher:

Abstract:

It is a legitimate assertion that the common ground of work of worth in architecture, whether theoretical or built, is a firmly held position on the part of the author. In addition to delivering key competencies, architectural education should act to support the formation of such a position in the student, or at least to make students aware of the possibility of holding such a position.

It is perhaps with this in mind that intensive unit-based diploma and master's structures are increasingly becoming the standard for schools of architecture across the UK. The strengths of such a structure are most evident when the school, whether by virtue of financial strength or geographic location, is able to bring a diverse range of contrasting positions to bear in the formation of these units. In effect, the offering to the student is a short, intensive immersion in a clear line of thought based on the position of those running the unit, whose research is channelled into the work of the students. A single cohort of students is therefore able to observe and understand a wide range of ways of thinking about the subject, whether or not they participate in a given unit. It is axiomatic that where this structure is applied in the absence of these resources the result can be less helpful: individual units are differentiated not to reflect the interests of those running them but for the sake of difference as an end in itself.

In structuring the M.Arch programme at Queen's University Belfast, the reality of our somewhat peripheral location was placed at the forefront of our considerations. A single four-semester studio is offered. The first three semesters are carefully structured to offer the students a range of directed and self-directed projects. By interrogating these projects, together with work undertaken at undergraduate level, the aim is to help students identify a personal position on architecture, which is then developed in the thesis in semester four. Research and design outputs emerge from the interests of the student body, cultivated by staff who have the time over the four semesters to get to know all aspects of a student's interests.

This paper lays out this structure and some of the projects run within it. Having now delivered two graduating years, the successes and challenges of the system are examined by reference to several case studies of individual student experiences of the structure.

Relevance:

100.00%

Publisher:

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and to establish the state of the art on the related topics. For the second and third RQs, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature, we identified the main elements needed to achieve robust performances and structured them into a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
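The VULA method's three questions (how much a KPI underperforms, how often, and for how long) lend themselves to a compact computation over a simulated KPI trace. The sketch below is a hypothetical illustration of that idea, not the published method: the function name `vula_metrics`, the `tolerance` parameter and the episode bookkeeping are all assumptions of this sketch.

```python
def vula_metrics(kpi_series, norm, tolerance=0.05):
    """Summarise underperformance episodes of a KPI against its norm.

    kpi_series : KPI observations per period (e.g. daily order fill rate)
    norm       : target value the KPI should achieve
    tolerance  : fraction of the norm treated as acceptable deviation
    """
    floor = norm * (1.0 - tolerance)       # lowest acceptable KPI value
    episodes = []                          # (start, length, worst) per episode
    start = worst = None
    for t, value in enumerate(kpi_series):
        if value < floor:                  # disturbance impact visible in KPI
            if start is None:
                start, worst = t, value
            else:
                worst = min(worst, value)
        elif start is not None:            # episode just ended
            episodes.append((start, t - start, worst))
            start = None
    if start is not None:                  # trace ends mid-episode
        episodes.append((start, len(kpi_series) - start, worst))

    return {
        "episodes": len(episodes),                           # how often
        "mean_duration": (sum(e[1] for e in episodes) / len(episodes)
                          if episodes else 0.0),             # how long
        "worst_gap": (max(norm - e[2] for e in episodes)
                      if episodes else 0.0),                 # how much
    }
```

Applied to, say, a fill-rate series produced by a discrete-event simulation of the SC scenario, the returned counts and gaps indicate whether performance robustness is acceptable for that KPI or whether a process redesign is warranted.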
To sum up, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance-impact-reducing redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, repeating the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas - SC robustness, and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as a basis of a VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye was approximately 23 years without treatment, compared with 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age thus appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would only cover 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective.
Screening using a test with initial automated classification, followed by assessment by a specialised optometrist for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates carry considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk and for quality assuring the programme, as well as adequate service provision for those screened positive, would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services.
Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
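The Markov submodels behind the economic evaluation can be illustrated in miniature. The cohort model below is a hedged sketch: the states, transition probabilities, utilities and most costs are invented placeholders rather than the report's parameter estimates, with only the £669 NHS cost of visual impairment loosely borrowed from the text as an annual state cost for illustration.

```python
# Four illustrative health states; a cohort is propagated one annual cycle at
# a time, accumulating quality-adjusted life-years (QALYs) and costs.
STATES = ["well", "oag", "impaired", "dead"]

def run_cohort(transition, utility, annual_cost, years):
    """Propagate a cohort through the Markov chain, returning (QALYs, cost)."""
    dist = {s: 0.0 for s in STATES}
    dist["well"] = 1.0                      # everyone starts disease-free
    qalys = cost = 0.0
    for _ in range(years):
        qalys += sum(dist[s] * utility[s] for s in STATES)
        cost += sum(dist[s] * annual_cost[s] for s in STATES)
        nxt = {s: 0.0 for s in STATES}      # apply one annual transition
        for s in STATES:
            for s2, p in transition[s].items():
                nxt[s2] += dist[s] * p
        dist = nxt
    return qalys, cost

# Without screening: OAG progresses to visual impairment at 4%/year (placeholder).
unscreened = {
    "well":     {"well": 0.97, "oag": 0.02, "impaired": 0.00, "dead": 0.01},
    "oag":      {"well": 0.00, "oag": 0.95, "impaired": 0.04, "dead": 0.01},
    "impaired": {"well": 0.00, "oag": 0.00, "impaired": 0.98, "dead": 0.02},
    "dead":     {"well": 0.00, "oag": 0.00, "impaired": 0.00, "dead": 1.00},
}
# With screening and early treatment: progression halved (placeholder effect).
screened = dict(unscreened)
screened["oag"] = {"well": 0.00, "oag": 0.97, "impaired": 0.02, "dead": 0.01}

utility = {"well": 1.0, "oag": 0.9, "impaired": 0.6, "dead": 0.0}
cost_unscreened = {"well": 0, "oag": 100, "impaired": 669, "dead": 0}
cost_screened = {"well": 20, "oag": 120, "impaired": 669, "dead": 0}  # +£20 screening

q0, c0 = run_cohort(unscreened, utility, cost_unscreened, 30)
q1, c1 = run_cohort(screened, utility, cost_screened, 30)
icer = (c1 - c0) / (q1 - q0)                # incremental cost per QALY gained
```

The resulting cost per QALY would then be compared with a willingness-to-pay threshold such as the £30,000 per QALY cited above; a negative ICER alongside a QALY gain would mean screening dominates.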

Relevance:

100.00%

Publisher:

Abstract:

RATIONALE, AIMS AND OBJECTIVES: Health care services offered to the public should be based on the best available evidence. We aimed to explore pharmacy tutors' and trainees' views on the importance of evidence when making decisions about over-the-counter (OTC) medicines and also to investigate whether the tutor influenced the trainee in practice.

METHODS: Following ethical approval and piloting, semi-structured interviews were conducted with pharmacy graduates (trainees) and pharmacist tutors. Transcribed interview data were entered into the NVivo software package (version 10), coded and analysed via thematic analysis.

RESULTS: Twelve trainees (five males, seven females) and 11 tutors (five males, six females) participated. Main themes that emerged were (in)consistency and contradiction, confidence, acculturation, and continuation and perpetuation. Despite participants' awareness of its importance and potential benefits, an evidence-based approach did not seem to be routinely or consistently implemented in practice. Confidence in products was largely derived from personal use and patient feedback. A lack of discussion about evidence was justified on the basis of not wanting to lessen patient confidence in requested product(s) or possibly negating the placebo effect. Trainees became acculturated to 'real-life' practice; university teaching and evidence were deemed less relevant than meeting customer expectations. The tutor's actions were mirrored by their trainee, resulting in continuation and perpetuation of the same professional attitudes and behaviours.

CONCLUSIONS: Evidence appeared to have limited influence on OTC decision making. The tutor played a key role in the trainee's professional development. More work could be performed to investigate how evidence can be regarded as relevant and something that is consistently implemented in practice.

Relevance:

100.00%

Publisher:

Abstract:

Background

Although the General Medical Council recommends that United Kingdom medical students are taught ‘whole person medicine’, spiritual care is variably recognised within the curriculum, and data on teaching delivery and attainment of learning outcomes are lacking. This study ascertained the views of Faculty and students about spiritual care and about how to teach and assess competence in delivering such care.

Methods

A questionnaire comprising 28 questions exploring attitudes to whole person medicine, spirituality and illness, and training of healthcare staff in providing spiritual care was designed using a five-point Likert scale. Free text comments were studied by thematic analysis. The questionnaire was distributed to 1300 students and 106 Faculty at Queen’s University Belfast Medical School.

Results

In total, 351 responses (54 staff, 287 students; 25%) were obtained. Over 90% agreed that whole person medicine includes physical, psychological and social components; 60% supported the inclusion of a spiritual component within the definition. Most supported the availability of spiritual interventions for patients, including access to chaplains (71%), counsellors (62%) or members of the patient's faith community (59%). 90% felt that personal faith/spirituality was important to some patients, and 60% agreed that this influenced health. However, 80% felt that doctors should never or rarely share their own spiritual beliefs with patients, and 67% felt they should only do so when specifically invited. Most supported including training on the provision of spiritual care within the curriculum; 40-50% felt this should be optional and 40% mandatory. Small group teaching was the favoured delivery method. 64% felt that teaching should not be assessed but, among assessment methods, reflective portfolios were most favoured (30%). Students tended to hold more polarised viewpoints but were generally more favourably disposed towards spiritual care than Faculty. Respecting patients' values and beliefs, and the need for guidance in the provision of spiritual care, were themes identified in the free-text comments.

Conclusions

Students and Faculty generally recognise a spiritual dimension to health and support the provision of spiritual care to appropriate patients. There is a lack of consensus on whether this should be delivered by doctors or left to others. Spiritual issues affecting patient management should be included in the curriculum; agreement is lacking on how such teaching should be delivered and assessed.

Relevance:

100.00%

Publisher:

Abstract:

Two sets of issues in the area of law and religion have generated a large share of attention and controversy across a wide number of countries and jurisdictions in recent years. The first set of issues relates to the autonomy of churches and other religiously affiliated entities such as schools and social service organisations in their hiring and personnel decisions, involving the question of how far, if at all, such entities should be free from the influence and oversight of the state. The second set of issues involves the presence of religious symbols in the public sphere, such as in state schools or on public lands, involving the question of how far the state should be free from the influence of religion. Although these issues – freedom of religion from the state, and freedom of the state from religion – could be viewed as opposite sides of the same coin, they are almost always treated as separate lines of inquiry, and the implications of each for the other have not been the subject of much scrutiny. In this Introduction, we consider whether insights might be drawn from thinking about these issues both from a comparative law perspective and also from considering these two lines of cases together.

Relevance:

100.00%

Publisher:

Abstract:

Pine wilt disease (PWD) is perhaps the most serious threat to pine forests worldwide. Since its discovery in the early 20th century by Japanese forest researchers, and the identification in the 1970s of its causative agent, the pinewood nematode (PWN) Bursaphelenchus xylophilus, PWD has wreaked havoc wherever it has appeared: first in the Far East (Japan, China and Korea) and, more recently, since 1999, in the EU (Portugal). The forest sector plays a major role in the Portuguese economy, contributing 12% of the industrial gross domestic product, 3.2% of the gross domestic product, 10% of foreign trade and 5% of national employment. Maritime pine (Pinus pinaster) is one of the most important pine species, supporting industrial activity such as the production of wood and resin, as well as the coastal protection associated with sand dunes. Stone pine (Pinus pinea) also plays an important role in the economy through the export of high-quality pine nuts. The introduction of a pest and pathogen such as the PWN therefore has a tremendous economic and ecological impact, although, as far as is known, the only susceptible species is maritime pine. Immediately following detection, the research team involved (Univ. Évora, INIAP) informed the national plant quarantine and forest authorities, which relayed the information to Brussels and the appropriate EU authorities. A task force (GANP), followed by a national programme (PROLUNP), was established. Since then, national surveys have been taking place, involving MADRP (Ministry of Agriculture), the University of Évora and several private organisations (e.g. UNAC). Forest growers in the area are particularly interested and involved, since the area owned by the growers' organisations totals 700 000 ha, largely affected by PWD. Detection of the disease has led to serious consequences and restrictions regarding the exploitation and commercialisation of wood.
A precautionary phytosanitary strip, 3 km wide, was established in 2007 surrounding the affected area. The Portuguese government, through its national programme PROLUNP, has been deeply involved since 1999 and, in conjunction with the EU (Permanent Phytosanitary Committee and FVO), is committed to controlling this nematode and its potential spread to the rest of the country and of the EU. The global impact of the presence of Bursaphelenchus xylophilus, or the threat of its introduction and the resulting pine wilt disease, in forested areas in different parts of the world is of increasing economic concern. The concern is exacerbated by the prevailing debate on climate change and the putative impact it could have on the vulnerability of the world's pine forests to this disease. The scientific and regulatory approach taken in different jurisdictions to the threat of pine wilt disease varies from country to country, depending on the perceived vulnerability of their pine forests to the disease and/or the economic cost of lost trade in wood products. Much of the research on pine wilt disease has been conducted in the northern hemisphere, especially in southern Europe and in the warmer, coastal Asian countries. However, there is an increasing focus on this problem in those countries of the southern hemisphere where plantations of susceptible pines have been established over the years. The forestry sectors in Australia and New Zealand are on "high alert" for this disease and practise strict quarantine procedures at all ports of entry for wood products. There is also heightened awareness, as there is worldwide, of the need to monitor wood packaging materials for all imported goods. Carrying out the necessary monitoring and assessment of products for B. xylophilus and its vectors incurs substantial costs, especially when decisions have to be made rapidly and regardless of whether the outcome is positive or negative. Australia's recent response to the appearance of some dying pines in a plantation illustrates the high sensitivity of some countries to this disease: some $200,000 was spent on the assessment in order to avert a potential loss of millions of dollars to the disease. This rapid, coordinated response was for naught, because the causal agent, once identified, was found not to be B. xylophilus. The episode illustrates the particular importance of taking responsibility at all levels of management to secure the site, and the need for a rapid, reliable diagnostic method for small nematode samples for use in the field. Australia is particularly concerned about the vulnerability of its 1 million hectares of planted forests, 80% of which are Pinus species, to incursions of one or more species of the insect vector. Monochamus alternatus incursions in wood pallets have been reported from Brisbane, Queensland. The climate of this part of Australia is such that the Pinus plantations are particularly vulnerable to the potential outcome of such incursions, and the state of Queensland is developing a risk management strategy and a proactive breeding programme in response to this putative threat. New Zealand has 1.6 million hectares of planted forests, and 89% of the commercial forest is Pinus radiata. Although the climate where these forests are located tends to be somewhat cooler than in Australia, the potential for establishment and development of the disease there is believed to be high. The passage alone of 200,000 m³/year of wood packaging through New Zealand ports is itself sufficient to require a response. The potential incursion of insect vectors of the pinewood nematode through the port system is regarded as high and is monitored carefully.
The enormous expansion of global trade and the continued use of unprocessed or inadequately processed wood for packaging purposes is a challenge for all trading nations, as such wood packaging material often harbours disease or pest species. The extent of the problem is readily illustrated by the expanding economies and exports of the countries of south-east Asia. China, Japan and Korea have significant areas of forestland infested with B. xylophilus, and these countries are also among the largest exporters of manufactured goods. Despite the attempts of authorities to ensure that only properly treated wood is used in the crating and packaging of goods, materials infested with B. xylophilus and/or its insect vectors are being recorded at ports worldwide. This is a reminder of the ease with which this nematode pest can gain access to forest lands in new geographic locations through inappropriate use, treatment or monitoring of wood products, and it especially highlights the necessity of finding an alternative to low-grade lumber for packaging purposes. Lest we believe that all wood products are always carriers of B. xylophilus and its vectors, it should be remembered that international trade of all kinds has occurred for thousands of years and that lumber-borne pests and diseases do not have a worldwide distribution. Other physico-biological factors play a significant role in the occurrence, establishment and sustainability of a disease. The question is often raised as to why the whole of southern Europe does not already have B. xylophilus and pine wilt disease, given that European countries have traded for hundreds of years with countries that are infested with B. xylophilus. Turkey is an example of a country that appears highly vulnerable to pine wilt disease owing to its extensive forests in the warm southern region where the vector, Monochamus galloprovincialis, occurs; yet there is no record of B. xylophilus occurring there, despite the importation of substantial quantities of wood from several countries. In many respects, Portugal illustrates both the challenge and the dilemma. In recent times B. xylophilus was discovered there, in the warm coastal region. The research, administrative and quarantine authorities responded rapidly, and B. xylophilus appears to have been confined to the region in which it was found. The rapid response would seem to have "saved the day" for Portugal. Nevertheless, it raises again the long-standing questions: how long had B. xylophilus been in Portugal before it was found? If Lisbon was the port of entry, which seems very likely, why had B. xylophilus not entered Lisbon many years earlier and established populations and pine wilt disease? Will the infestation in Portugal be sustainable, and will it spread or die out within a few years? We still do not have sufficient understanding of the biology of this pest to answer these questions.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: This paper explores the impact of academic scholarship on the development and practice of experienced managers. Design / Methodology: Semi-structured interviews with experienced managers, modelled on the critical incident technique. 'Intertextuality' and the framework analysis technique are used to examine whether the use of academic scholarship is a sub-conscious phenomenon. Findings: Experienced managers make little direct use of academic scholarship, drawing on it only occasionally to provide retrospective confirmation of decisions or a technique they can apply. However, academic scholarship informs their practice indirectly: their understanding of the 'gist' of scholarship comprises one of many sources which they synthesise and evaluate as part of their development process. Practical implications: Managers and management development practitioners should focus on developing the skill of synthesising the 'gist' of academic scholarship with other sources of data, rather than on the detailed remembering, understanding and application of specific scholarship, and on finding / providing the time and space for that 'gisting' and synthesis to take place. Originality / Value: The paper addresses contemporary concerns about the appropriateness of the material delivered on management education programmes for management development. It is original in doing so from the perspective of experienced managers, and in using intertextual analysis to reveal not only the direct but also the indirect uses they make of such scholarship. The finding that what matters is understanding the 'gist' rather than the detail of academic theory represents a key conceptual innovation.

Relevance:

100.00%

Publisher:

Abstract:

It is well established that, for an organised society to develop politically and juridically, a formal document of mandatory observance is indispensable: one capable of defining public competences and delimiting the powers of the State, safeguarding fundamental rights against possible abuses by political entities. This document is the Constitution, which has been present in States at every moment of history, though not initially in written form. From this arose constitutionalism, a movement that advocated the drafting of written constitutions, endowed with normative force and supremacy over all other normative species, intended to organise the separation of state powers and to declare individual rights and liberties. However, enacting a Supreme Law would be of little use without defence mechanisms to ward off any threat to legal certainty and social stability posed by a law or normative act contrary to the precepts established in the Constitution. Constitutional review, a pillar of the rule of law, consists in verifying the compatibility between a law, or any infra-constitutional normative act, and the Supreme Law; where there is conflict, the defective law or act must be expunged from the legal order so that constitutional unity is restored. In Brazil, constitutional review was instituted under the strong influence of the North American model and received varied treatment across the Brazilian constitutions; the system of constitutionality review reached its apex, however, with the advent of the current Federal Constitution, promulgated on 5 October 1988, and the creation of innovative procedural instruments for verifying the constitutionality of laws and normative acts.
Moreover, the 1988 Constitution, unlike its predecessors, strengthened the Judiciary in the political context, granting judges greater autonomy in deciding cases of great national repercussion and resulting in the judicial protagonism seen today. In this context, the Supremo Tribunal Federal, the highest organ of the national Judiciary and guardian of the Constitution, has stood out on the national scene, especially in the defence of the fundamental rights and guarantees inscribed in the Fundamental Law. An analysis of the Court's case law is therefore needed to verify whether constitutional review in Brazil has in fact evolved over recent years and, if so, under what circumstances.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The new Physiotherapy and Occupational Therapy programmes, based in the Faculty of Health Sciences, McMaster University (Hamilton, Ontario), are unique. The teaching and learning philosophies utilized are based on learner-centred and self-directed learning theories. The 1991 admissions process of these programmes attempted to select individuals who would make highly qualified professionals and who would have the necessary skills to complete such unique programmes. In order to: 1. learn more about the concept of self-directed learning and its related characteristics in health care professionals; 2. examine the relationship between various student characteristics (personal, learner, and those assessed during the admissions process) and final course grades; and 3. determine which, if any, student characteristics could be considered predictors of success in learner-centred programmes requiring self-directed learning skills, a correlational research design was developed and carried out. Thirty Occupational Therapy and thirty Physiotherapy students were asked to complete two instruments: a questionnaire developed by the author and the Oddi Continuing Learning Inventory (Oddi, 1986). Course grades and ratings of students during the admissions process were also obtained. Both questionnaires were examined for reliability, and factor analyses were conducted to determine construct validity. Data obtained from the questionnaires, course grades and student ratings (from the admissions process) were analyzed and compared using the contingency coefficient, Pearson's product-moment correlation coefficient, and multiple regression analysis.
The research findings demonstrated a positive relationship (as identified by contingency coefficient or Pearson r values) between various course grades and the following personal and learner characteristics: field of study of the highest level of education achieved, level of education achieved, sex, marital status, motivation for completing the programmes, reasons for enrolling in the programmes, decision to enrol in the programmes, employment history, preferred learning style, strong self-concept, and the identification of various components of the concept of self-directed learning. In most cases, the relationships were significant at the 0.01 or 0.001 levels. Results of the multiple regression analyses demonstrated that several learner and admissions characteristic variables had R² values that accounted for the largest proportion of the variance in several dependent variables; these variables could therefore be considered predictors of success. The learner characteristics included level of education and strong self-concept. The admissions characteristics included ability to evaluate strengths, ability to give feedback, curiosity and creativity, and communication skills. It is recommended that research continue to be conducted to substantiate the relationships found between course grades and characteristic variables in more diverse populations. 'Success in self-directed programmes' from the learner's perspective should also be investigated. The Oddi Continuing Learning Inventory should continue to be researched; further work may lead to refinement or further development of the instrument, and may provide further insight into self-directed learner attributes. The concept of self-directed learning continues to be incorporated into educational programmes, and thus should continue to be explored.
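The two statistics the abstract relies on, Pearson's product-moment correlation and the R² values from multiple regression, can be sketched briefly. The data and variable names below (`self_concept`, `education`, `grade`) are purely hypothetical stand-ins for the study's actual measures; only the statistics themselves come from the abstract.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def r_squared(X, y):
    """Proportion of variance in y explained by an ordinary least-squares fit on X."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    design = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

# Hypothetical data: 30 students, two learner characteristics, one course grade.
rng = np.random.default_rng(0)
self_concept = rng.normal(size=30)
education = rng.normal(size=30)
grade = 0.6 * self_concept + 0.3 * education + rng.normal(scale=0.5, size=30)

r = pearson_r(self_concept, grade)  # bivariate relationship with one characteristic
r2 = r_squared(np.column_stack([self_concept, education]), grade)  # joint variance explained
```

With real data, `r2` corresponds to the quantity the abstract reports as an R² value accounting for variance in a dependent variable; the relative size of the fitted coefficients is then what marks a variable as a candidate predictor of success.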

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Introduction. The question of the meaning, methods and philosophical manifestations of history is currently rife with contention. The problem that I will address in an exposition of the thought of Wilhelm Dilthey and Martin Heidegger centers around the intersubjectivity of an historical world. Specifically, there are two interconnected issues. First, since all knowledge occurs to a person from within his or her historical age, how can any person in any age make truth claims? In order to answer this concern we must understand the essence and role of history. Yet how can we come to an individual understanding of what history is when the meanings that we use are themselves historically enveloped? But can we, who are well aware of the knowledge that archaeology has dredged up from old texts or even from 'living' monuments of past ages, really neglect to notice these artifacts that exist within and enrich our world? Charges of wilful blindness would arise if any attempt were made to suggest that certain things of our world did not come down to us from the past. Thus it appears more important to determine what this 'past' is, and therefore how history operates, than simply to derail the possibility of historical understanding. Wilhelm Dilthey, the great German historicist of the 19th century, did not question the existence of historical artifacts as coming from the past, but in treating knowledge as one such artifact he placed the onus on knowledge to show itself as true, or meaningful, in light of the fact that other historical periods relied on different facts and generated different truths or meanings. The problem for him was not just determining what the role of history is, but moreover discovering how knowledge could make any claim to be true knowledge. As he stated, there is a problem of "historical anarchy". Martin Heidegger picked up these two strands of Dilthey's thought and wanted to answer the problem of truth and meaning in order to solve the problem of historicism.
This problem underscored, perhaps for the first time, that societal presuppositions about the past and present of their era are not immutable. Penetrating to the core of the raison d'être of the age was an historical reflection about the past, which was now conceived as separated both temporally and attitudinally from the present. But further than this, Heidegger's focus on asking the question of the meaning of Being meant that history must be ontologically explicated, not merely ontically treated. Heidegger hopes to remove barriers to a genuine ontology by including history in an assessment of previous philosophical systems. He does this in order that the question of Being be more fully explicated, which for him necessarily includes the question of the Being of history. One approach to the question of what history is, given the information that we get from historical knowledge, is to ask whether such knowledge can be formalized into a science. Additionally, we can approach the question of what the essence and role of history is by revealing its underlying characteristics, that is, by focussing on historicality. Thus we will begin with an expository look at Dilthey's conception of history and historicality. We will then explore these issues first in Heidegger's Being and Time, and then, in the third chapter, in his middle and later works. Finally, we shall examine how Heidegger's conception may reflect a development in the conception of historicality over Dilthey's historicism, and what such a conception means for a contemporary historical understanding. The problem of existing in a common world which is perceived only individually has been philosophically addressed in many forms. Escaping a pure subjectivist interpretation of 'reality' has occupied Western thinkers not only in order to discover metaphysical truths, but also to provide a foundation for politics and ethics.
Many thinkers accept a solipsistic view as inevitable and reject attempts at justifying truth in an intersubjective world. The problem of historicality raises similar problems. We exist in a common historical age, presumably, yet are only aware of the historicity of the age through our own individual thoughts. Thus the question arises: do we actually exist within a common history, or do we merely individually interpret it as communal? What is the reality of history, individual or communal? Dilthey answers this question by asserting a 'reality' to the historical age, thus overcoming solipsism by encasing individual human experience within the historical horizon of the age. This, however, does nothing to address the epistemological concern over the discoverability of truth. Heidegger, on the other hand, rejects a metaphysical construal of history and seeks to ground history first within the ontology of Dasein and second within the so-called "sending" of Being. Thus there can be no solipsism for Heidegger, because Dasein's Being is necessarily "co-historical", Being-with-Others; furthermore, this historical-Being-in-the-world-with-Others is the horizon of Being over which truth can appear. Heidegger's solution to the problem of solipsism appears to satisfy the demand that the world not be just a subjective idealist creation, and also that one need not appeal to any universal measures of truth or presumed eternal verities. Thus, in elucidating Heidegger's notion of history, I will also confront the issues of Dasein's Being-alongside-things as well as the Being of Dasein as Being-in-the-world, so that Dasein's historicality is explicated vis-à-vis the "sending of Being" (die Schicken des Seins).

Relevância:

100.00% 100.00%

Publicador:

Resumo:

It is our intention in the course of the development of this thesis to give an account of how intersubjectivity is "eidetically" constituted by means of the application of the phenomenological reduction to our experience, in the context of the thought of Edmund Husserl, contrasted with various representative thinkers in what H. Spiegelberg refers to as "the wider scene" of phenomenology. That is to say, we intend to show those structures of both consciousness and the relation which man has to the world which present themselves as the generic conditions for the possibility of overcoming our "radical solitude" in order that we may gain access to the mental life of an Other as other human subject. It is clear that in order for us to give expression to these accounts in a coherent manner, along with their relative merits, it will be necessary to develop the common features of any phenomenological theory of consciousness whatever. Therefore, our preliminary inquiry, subordinate to the larger theme, shall be into some of the epistemological results of the application of the phenomenological method used to develop a transcendental theory of consciousness. Inherent in this will be the delineation of the exigency for making this an "intentional" theory. We will then be able to see how it is possible to overcome transcendentally the Other as an object merely given among other merely given objects, and further, how this other is constituted specifically as other ego. The problem of transcendental intersubjectivity and its constitution in experience can be viewed as one of the most compelling, if not the most polemical, of issues in phenomenology. To be sure, right from the beginning we are forced to ask a number of questions regarding Husserl's responses to the problem within the context of the methodological genesis of the Cartesian Meditations and The Crisis of European Sciences and Transcendental Phenomenology. This we do in order to set the stage for amplification.
First, we ask: has Husserl lived up to his goal, in this connexion, of an apodictic result? We recall that in his Logos article of 1911 he admonished that previous philosophy "does not have at its disposal a merely incomplete and, in particular instances, imperfect doctrinal system; it simply has none whatever. Each and every question is herein controverted, each position is a matter of individual conviction, of the interpretation given by a school, of a 'point of view'."[1] Moreover, in the same article he writes that his goal is a philosophical system of doctrine that, "after the gigantic preparatory work of generations, really begins from the ground up with a foundation free from doubt and rises up like any skilful construction, wherein stone is set upon stone, each as solid as the other, in accord with directive insights."[2] Reflecting upon the fact that he foresaw "preparatory work of generations", we perhaps should not expect that he would claim that his was the last word on the matter of intersubjectivity. Indeed, given the relatively small amount of material Husserl published on the subject, we can assume that he himself was not entirely satisfied with his solution. [1: Edmund Husserl, "Philosophy as a Rigorous Science", in Phenomenology and the Crisis of Philosophy, trans. with an introduction by Quentin Lauer (New York: Harper & Row, 1965), pp. 74-5. 2: Ibid., pp. 75-6.] The second question is this: if the transcendental reduction is to yield the generic and apodictic structures of the relationship of consciousness to its various possible objects, how far can we extend this particular constitutive synthetic function to intersubjectivity, where the objects must of necessity always remain delitescent? To be sure, the type of 'object' here to be considered is unlike any other which might appear in the perceptual field.
What kind of indubitable evidence will convince us that the being we label "alter-ego", attributed to an object which appears to resemble another body that we have never, and can never, see the whole of (namely, our own body), is anything more than a cleverly contrived automaton? What is the nature of this peculiar intentional function which enables us to say "you think just as I do"? If phenomenology is to take such great pains to reduce the taken-for-granted, lived, everyday world to an immanent world of pure presentation, we must ask the mode of presentation for transcendent subjectivities. And in the end, we must ask whether Husserl's argument is not reducible to a case (however special) of reasoning by analogy and, if so, whether this type of reasoning is not so removed from that from which the analogy is made that it would render all transcendental intersubjective understanding impossible. 2. Historical and Eidetic Priority: The Necessity of Abstraction. The problem is not a simple one. What is being sought are the conditions for the possibility of experiencing other subjects. More precisely, the question of the possibility of intersubjectivity is the question of the essence of intersubjectivity. What we are seeking is the absolute route from one solitude to another. Inherent in this programme is the ultimate discovery of the meaning of community. That this route needs to be "abstract" requires some explanation. It requires little explanation that we agree with Husserl in fixing the goal of philosophy on apodictic, unquestionable results. This means that we seek a philosophical approach which, though not necessarily free from assumptions, examines and makes explicit all assumptions in a thorough manner.
It would be helpful at this point to distinguish between "eidetic" priority and "historical" priority in order to shed some light on the value, in this context, of an abstraction.[3] It is true that intersubjectivity is mundanely an accomplished fact, there having been so many millions of years for humans to believe in the existence of one another's ability to think as they do. But what we seek is not to study how this proceeded historically, but rather the logical, nay, "psychological" conditions under which this is possible at all. [3: Cf. Maurice Natanson, The Journeying Self: A Study in Philosophy and Social Role (Santa Cruz: U. of California Press, 1970).] It is therefore irrelevant to the exegesis of this monograph whether or not anyone should shrug his shoulders and mumble "why worry about it, it is always already engaged". By way of an explanation of the value of logical priority, we can find an analogy in the case of language. Certainly language in a spoken or written form predates the formulation of the appropriate grammar. However, this grammar has a logical priority insofar as it lays out the conditions from which that language exhibits coherence. The act of formulating the grammar is a case of abstraction. The abstraction towards the discovery of the conditions for the possibility of any experiencing whatever, of which intersubjective experience is a definite case, manifests itself as a sort of "grammar". This "grammar" is like the basic grammar of a language in the sense that these "rules" are the a priori conditions for the possibility of that experience. There is, we shall say, an "eidetic priority", or a generic condition which is the logical antecedent to the taken-for-granted object of experience. In the case of intersubjectivity we readily grant that one may mundanely be aware of fellow-men as fellow-men, but in order to discover how that awareness is possible it is necessary to abstract from the mundane, believed-in experience.
This process of abstraction is the paramount issue, the first step in the search for an apodictic basis for social relations. How then is this abstraction to be accomplished? What is the nature of an abstraction which would permit us an Archimedean point, absolutely grounded, from which we may proceed? The answer can be discovered in an examination of Descartes in the light of Husserl's criticism. 3. The Impulse for Scientific Philosophy; The Method to which it Gives Rise. Foremost in our inquiry is the discovery of a method appropriate to the discovery of our grounding point. For the purposes of our investigations, i.e., that of attempting to give a phenomenological view of the problem of intersubjectivity, it would appear to be of cardinal importance to trace the attempt of philosophy predating Husserl, particularly the philosophy of Descartes, at founding a truly "scientific" philosophy. Paramount in this connexion would be the impulse in the Modern period, as the result of more or less recent discoveries in the natural sciences, to found philosophy upon scientific and mathematical principles. This impulse was intended to culminate in an all-encompassing knowledge which might extend to every realm of possible thought, viz., the universal science or "Mathesis Universalis".[4] This was a central issue for Descartes, whose conception of a universal science would include all the possible sciences of man. This inclination towards a science upon which all other sciences might be based was not to be belittled by Husserl, who would appropriate it himself in hopes of establishing, for the very first time, philosophy as a "rigorous science". [4: This term, according to Jacob Klein, was first used by Barocius, the translator of Proclus into Latin, to designate the highest mathematical discipline.]
It bears emphasizing that this in fact was the drive for the hardening of the foundations of philosophy, the link between the philosophical projects of Husserl and those of the philosophers of the modern period. Indeed, Husserl owes Descartes quite a debt for indicating the starting place from which to attempt a radical, presuppositionless, and therefore scientific philosophy, in order not to begin philosophy anew, but rather for the first time.[5] The aim of philosophy for Husserl is the search for apodictic, radical certitude. However, while he attempted to locate in experience the type of necessity which is found in mathematics, he wished this necessity to be a function of our life in the world, as opposed to the definition and postulation of an axiomatic method as might be found in the unexpurgated attempts to found philosophy in Descartes. Beyond the necessity which is involved in experiencing the world, Husserl was searching for the certainty of roots, of the conditions which underlie experience and render it possible. Descartes believed that his Meditations had uncovered an absolute ground for knowledge, one founded upon the ineluctable givenness of thinking, which is present even when one doubts thinking. Husserl acknowledges that this procedure is certainly Cartesian, but moves, despite this debt to Descartes, far beyond Cartesian philosophy in his phenomenology (and in many respects, closer to home). [5: Cf. Husserl, Philosophy as a Rigorous Science, pp. 74ff.] But wherein lies this Cartesian jumping-off point by which we may vivify our theme? Descartes, through inner reflection, saw that all of his convictions and beliefs about the world were coloured in one way or another by prejudice:
"... at the end I feel constrained to reply that there is nothing in all that I formerly believed to be true, of which I cannot in some measure doubt, and that not merely through want of thought or through levity, but for reasons which are very powerful and maturely considered; so that henceforth I ought not the less carefully to refrain from giving credence to these opinions than to that which is manifestly false, if I desire to arrive at any certainty (in the sciences)."[6] Doubts arise regardless of the nature of belief; one can never completely believe what one believes. Therefore, in order to establish absolutely grounded knowledge, which may serve as the basis for a "universal science", one must use a method by which one may purge oneself of all doubts and thereby gain some radically indubitable insight into knowledge. Such a method, Descartes found, was that, as indicated above by his own words, of "radical doubt", which "forbids in advance any judgemental use of (previous convictions and) which forbids taking any position with regard to their validity".[7] This is the method of the "sceptical epoché", the method of doubting all which had heretofore been considered as belonging to the world, including the world itself. [6: Descartes, Meditations on First Philosophy, First Meditation, trans. L. LaFleur (New York: Liberal Arts Press, 1954), p. 10. 7: Husserl, Crisis of European Sciences and Transcendental Phenomenology (Evanston: Northwestern U. Press, 1970), p. 76.] What then is left over? Via the process of a thorough and all-inclusive doubting, Descartes discovers that the ego which performs the epoché, or "reduction", is excluded from those things which can be doubted and, in principle, provides something which is beyond doubt. Consequently this ego provides an absolute and apodictic starting point for founding scientific philosophy. By way of this abstention of belief, Descartes managed to reduce the world of everyday life as believed in to mere 'phenomena', components of the res cogitans. Thus, having discovered his Archimedean point, the existence of the ego beyond question, he proceeds to deduce the 'rest' of the world with the aid of innate ideas and the veracity of God. In both Husserl and Descartes the compelling problem is that of establishing a scientific, apodictic philosophy based upon presuppositionless groundwork. Husserl, in this regard, levels the charge at Descartes that the engagement of his method was not complete, such that his starting place was not indeed presuppositionless, and that the validity of both causality and deductive methods were not called into question in the performance of the epoché. "In this way it is easy for an absolute evidence to make sure of the ego as a first, 'absolute, indubitably existing tag-end of the world', and it is then only a matter of inferring the absolute substance and the other substances which belong to the world, along with my own mental substance, using a logically valid deductive procedure."[8] [8: Husserl, E., Cartesian Meditations, trans. Dorion Cairns (The Hague: Martinus Nijhoff, 1970), p. 24 ff.]

Relevância:

100.00% 100.00%

Publicador:

Resumo:

In Canada freedom of information must be viewed in the context of governing: how do you deal with an abundance of information while balancing a diversity of competing interests? How can you ensure people are informed enough to participate in crucial decision-making, yet willing enough to let some administrative matters be dealt with in camera, without their involvement in every detail? In an age when taxpayers' coalition groups are on the rise, and the government is encouraging the establishment of Parent Council groups for schools, the issues and challenges presented by access to information and protection of privacy legislation are real ones. The province of Ontario's decision to extend freedom of information legislation to local governments does not ensure, or equate to, full public disclosure of all facts, nor does it necessarily guarantee complete public comprehension of an issue. The mere fact that local governments, like school boards, decide to collect, assemble or record some information and not other information implies that a prior decision was made by "someone" about what was important to record or keep. That in itself means that not all the facts are going to be disclosed, regardless of the presence of legislation. The resulting lack of information can lead to public mistrust and lack of confidence in those who govern. This is completely contrary to the spirit of the legislation, which was to provide interested members of the community with facts so that values like political accountability and trust could be ensured and meaningful criticism and input obtained on matters affecting the whole community. This thesis first reviews the historical reasons for adopting freedom of information legislation, reasons which are rooted in our parliamentary system of government.
However, the same reasoning for enacting such legislation cannot be applied carte blanche to the municipal level of government in Ontario, or more specifically to the programs, policies or operations of a school board. The purpose of this thesis is to examine whether the Municipal Freedom of Information and Protection of Privacy Act, 1989 (MFIPPA) was a necessary step to ensure greater openness from school boards. Based on a review of the Orders made by the Office of the Information and Privacy Commissioner/Ontario, it also assesses how successfully freedom of information legislation has been implemented at the municipal level of government. The Orders provide an opportunity to review what problems school boards have encountered and what guidance the Commissioner has offered. Reference is made to a value framework as an administrative tool in critically analyzing the suitability of MFIPPA to school boards. The conclusion is drawn that MFIPPA appears to have inhibited rather than facilitated openness in local government. This may be attributed to several factors, including the general uncertainty, confusion and discretion in interpreting various provisions and exemptions in the Act. Some of the uncertainty is due to the fact that an insufficient number of school board staff are familiar with the Act. The complexity of the Act and its legalistic procedures have over-formalized the processes of exchanging information. In addition there appears to be a concern among municipal officials that granting any access to information may violate the personal privacy rights of others. These concerns translate into indecision and extreme caution in responding to inquiries. The result is delay in responding to information requests and lack of uniformity in the responses given. However, the mandatory review of the legislation does afford an opportunity to address some of these problems and to make this complex Act more suitable for application to school boards.
In order for the Act to function more efficiently and effectively, legislative changes must be made to MFIPPA. It is important that the recommendations for improving the Act be adopted before the government extends this legislation to any other public entities.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

A large variety of social signals, such as facial expression and body language, are conveyed in everyday interactions, and an accurate perception and interpretation of these social cues is necessary for reciprocal social interactions to take place successfully and efficiently. The present study was conducted to determine whether impairments in social functioning that are commonly observed following a closed head injury could be at least partially attributable to disruption in the ability to appreciate social cues. More specifically, an attempt was made to determine whether face processing deficits following a closed head injury (CHI) coincide with changes in electrophysiological responsivity to the presentation of facial stimuli. A number of event-related potentials (ERPs) that have been linked specifically to various aspects of visual processing were examined. These included the N170, an index of structural encoding ability; the N400, an index of the ability to detect differences in serially presented stimuli; and the Late Positivity (LP), an index of sensitivity to affective content in visually presented stimuli. Electrophysiological responses were recorded while participants with and without a closed head injury were presented with pairs of faces delivered in a rapid sequence and asked to compare them on the basis of whether they matched with respect to identity or emotion. Other behavioural measures of identity and emotion recognition were also employed, along with a small battery of standard neuropsychological tests used to determine general levels of cognitive impairment. Participants in the CHI group were impaired in a number of cognitive domains that are commonly affected following a brain injury.
These impairments included reduced efficiency in various aspects of encoding verbal information into memory, a generally slower rate of information processing, decreased sensitivity to smell, and greater difficulty in the regulation of emotion together with limited awareness of this impairment. Impairments in face and emotion processing were clearly evident in the CHI group. However, despite these impairments in face processing, there were no significant differences between groups in the electrophysiological components examined. The only exception was a trend indicating delayed N170 peak latencies in the CHI group (p = .09), which may reflect inefficient structural encoding processes. In addition, group differences were noted in the region of the N100, thought to reflect very early selective attention. It is possible, then, that facial expression and identity processing deficits following CHI are secondary to (or exacerbated by) an underlying disruption of very early attentional processes. Alternatively, the difficulty may arise in the later cognitive stages involved in the interpretation of the relevant visual information. However, the present data do not allow these alternatives to be distinguished. Nonetheless, it was clearly evident that individuals with CHI are more likely than controls to make face processing errors, particularly for the more difficult to discriminate negative emotions. Those working with individuals who have sustained a head injury should be alerted to this potential source of social monitoring difficulties, which is often observed as part of the sequelae following a CHI.
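The N170 latency comparison above rests on two standard ERP operations: averaging time-locked epochs into a waveform, and measuring peak latency inside a search window. The sketch below is a generic illustration of those two steps, not the study's actual pipeline; the sampling rate, search window, and synthetic waveform are all assumptions made for the example.

```python
import numpy as np

def erp_average(epochs):
    """Average time-locked epochs (trials x samples) into one ERP waveform."""
    return np.asarray(epochs, float).mean(axis=0)

def peak_latency_ms(erp, sfreq, window_ms=(130, 200), polarity=-1):
    """Latency (ms) of the most extreme deflection inside a search window.

    polarity=-1 looks for a negative-going peak (e.g. an N170); +1 for positive.
    """
    start = int(window_ms[0] * sfreq / 1000)
    stop = int(window_ms[1] * sfreq / 1000)
    segment = polarity * np.asarray(erp, float)[start:stop]
    return (start + int(np.argmax(segment))) * 1000.0 / sfreq

# Synthetic single-channel ERP: a negative Gaussian dip centred at 170 ms.
sfreq = 500  # Hz, assumed sampling rate
t = np.arange(0, 0.4, 1 / sfreq)
erp = -np.exp(-((t - 0.170) ** 2) / (2 * 0.01 ** 2))

latency = peak_latency_ms(erp, sfreq)  # recovers the 170 ms dip
```

Comparing such latencies between groups is what yields the kind of delayed-N170 trend the abstract reports; dedicated EEG toolkits provide equivalent, more robust peak-finding routines.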

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The experience of suffering of the old, in loss of physical autonomy, living in a CHSLD (a Quebec long-term care facility) comprises two linked dimensions: the suffering and the old themselves. The hypothesis about the meaning of suffering takes account of the meaning the old have given, and continue to give, to their lives: the meaning of suffering depends on the meaning of life. Depending on whether one is individualist, agnostic humanist, or religious humanist, the meaning of suffering takes on a particular colour. In turn, the thesis examines the two parts of the research problem, draws a portrait of the old in the year 2008, proposes a theoretical foundation for the research project, and establishes a linkage of meaning between the suffering of the old and the meaning of their lives. The life of the old in a CHSLD is discontinuous with their previous existence: their values and their rhythm of life are called into question. Their presence in a substitute residence invites reflection on the place of the old in contemporary individualist society and on the humanization of services. How can individualism and humanization be reconciled? How does one live with the loss of one's autonomy, with global suffering, with a degree of isolation? These are all questions and issues that challenge society as a whole. Elderly people call for an empathetic entourage, dynamic caregivers, and health policies that make these centres true environments for living and for care. This is a collective responsibility in the face of the movement of social exclusion.