824 results for Teaching, Freedom of.
Abstract:
OBJECTIVES: The treatment of recurrent rejection in heart transplant recipients has been a controversial issue for many years. The intent of this retrospective study was to perform a risk-benefit analysis of two treatment strategies: bolus steroids only versus anti-thymocyte globulins (RATG; 1.5 mg/kg q 4 days). METHODS: Between 1986 and 1993, 69 of 425 patients (17 male, 52 female; mean age 44 ± 11 years) who had more than one rejection per patient per month (rej/pt per mo) in the first 3 postoperative months were defined as recurrent rejectors. RESULTS: Repetitive methylprednisolone bolus therapy (70 mg/kg q 3 days) was given to 27 patients (group M; 1.4 ± 0.2 rej/pt per mo), and RATG therapy was given for one of the rejection episodes in the remaining 42 patients (group A; 1.5 ± 0.2 rej/pt per mo). The quality of triple-drug immunosuppression in the two study groups was comparable. The rejection-free interval (RFI) was 21.6 ± 10 days following RATG treatment in group A and 22 ± 11 days in group M. In group M, 3 of 27 patients (11%) had a rejection treatment-related infection (2 bacterial, 1 viral) versus 6 of 42 patients in group A (14.2%; 1 bacterial, 5 viral). During postoperative months 3-24, 0.15 ± 0.12 rej/pt per mo were observed in group M and 0.21 ± 0.13 rej/pt per mo in group A (n.s.). In this 21-month period, cytolytic therapy for rejection was initiated in 8 of the remaining 21 patients of group M (38%) and in 15 of the remaining 37 patients of group A (40.5%). Overall survival and the individual causes of death were not affected by the type of initial treatment of recurrent rejection. Actuarial freedom from graft atherosclerosis was comparable in the two groups, with 78% of group A versus 79% of group M free of graft atherosclerosis at 3 years postoperatively. CONCLUSIONS: A comparison of cytolytic therapy versus repeated applications of bolus steroids for treatment of recurrent rejection reveals no significant difference in long-term patient outcome with respect to the incidence of subsequent rejection episodes and survival.
Abstract:
High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but will mainly increase the accuracy of the representation of these processes. This is particularly the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space, because electrodes are correlated due to volume conduction, and in time, because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, in many ERP studies, the intrinsic correlation structure of the data has been disregarded. Often, some electrodes or groups of electrodes are a priori selected as the analysis entity and considered as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g. in terms of what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural "atomic" analysis entity of EEG and ERP data is the scalp electric field.
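The gap between nominal and effective degrees of freedom can be made concrete with a small simulation. The sketch below is our illustration, not from the chapter: it mixes a handful of underlying sources into many electrodes, as volume conduction does, and summarizes the effective dimensionality of the spatial covariance with a participation ratio; all names and numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate EEG-like data: 64 electrodes driven by only 5 underlying
# sources, mimicking the spatial redundancy produced by volume conduction.
n_electrodes, n_samples, n_sources = 64, 1000, 5
mixing = rng.normal(size=(n_electrodes, n_sources))   # stand-in "leadfield"
sources = rng.normal(size=(n_sources, n_samples))
data = mixing @ sources + 0.1 * rng.normal(size=(n_electrodes, n_samples))

# Effective dimensionality via the participation ratio of the eigenvalue
# spectrum of the spatial covariance: (sum w)^2 / sum w^2. Independent
# channels would give ~64; redundant channels give far fewer.
w = np.linalg.eigvalsh(np.cov(data))
eff_dof = w.sum() ** 2 / (w ** 2).sum()
print(f"nominal channels: {n_electrodes}, effective dimensions: {eff_dof:.1f}")
```

Adding electrodes to such data increases the nominal channel count while the participation ratio stays near the number of sources, which is the redundancy argument made above.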
Abstract:
At large research universities, a common approach for teaching hundreds of undergraduate students at one time is the traditional large lecture-based course. Trends indicate that over the next decade there will be an increase in the number of large campus courses being offered, as well as larger enrollments in the courses currently offered. As universities investigate alternative means to accommodate more students and their learning needs, Web-based instruction provides an attractive delivery mode for teaching large on-campus courses. This article explores a theoretical approach regarding how Web-based instruction can be designed and developed to provide quality education for traditional on-campus undergraduate students. The academic debate over the merit of Web-based instruction for traditional on-campus students has not been resolved. This study identifies and discusses instructional design theory for adapting a large lecture-based course to the Web.
Abstract:
This article discusses democratic elements in early Islamic sources and in the programs of the Algerian FIS (Front Islamique du Salut) and ANNAHDA in Tunisia. According to historical writings, Islam includes the principles of democratic consensus, consultation, and freedom of opinion, and an understanding that the sources of Islamic jurisdiction are subject to interpretation, that the sharia can be changed, and that religious authorities' power to issue instructions on worldly matters is limited. These are the types of expectations that fundamentalist parties arouse when they speak of an Islamic caliphate as a state system. Against this background, an examination of the political system proposed until 1992 by the Algerian FIS shows that this system would have resulted in a very restrictive form of Islam. An investigation of the political system of the Tunisian fundamentalist leader Rached al-Ghannouchi reveals that the system he proposes may be designated an Islamic democracy, since it takes into account the separation of powers and a pluralism of political parties. The head of state would be subject to the law in the same manner as the people. However, it is not a liberal democracy, as he categorically rejects secularism, intends to punish apostates, and is only willing to allow political parties that are based on the religion of Islam. His state would be a state only of those citizens who follow Islam, completely neglecting secularist groups. Social conflicts and unrest are thus predetermined.
Abstract:
The report examines the relationship between day-care institutions, schools and so-called "parents unfamiliar with education", as well as the relationships between the institutions themselves. Within Danish public and professional discourse, concepts like "parents unfamiliar with education" usually refer to environments, parents or families with either no or only very restricted experience of education beyond basic school (folkeskole). The "grand old man" of Danish educational research, Prof. Em. Erik Jørgen Hansen, defines the concept as follows: parents who are distant from or not familiar with education are parents without a tradition of education, and by that fact they are not able to contribute constructively to backing up their own children during their education. Many teachers and pedagogues are not used to that term; they prefer concepts like "socially exposed" or "socially disadvantaged" parents, social classes or strata. The report does not focus only on parents who are not capable of supporting the school achievements of their children, since a low level of education is usually connected with social disadvantage. Such parents are often not capable of understanding and meeting the demands made by the school when sending their children there; they lack the competencies, or the necessary competence for action. At present, much attention is being paid by the Ministries of Education and Social Affairs (recently renamed the Ministry of Welfare) to creating equal possibilities for all children. Many kinds of experts (boards, councils, researchers, etc.) have been more than eager to promote recommendations aimed at achieving the ambitious goal that by 2015, 95% of all young people should complete a full education (classes 10-12). Research results point to the importance of increased participation of parents; in other words, the agenda is set for 'parents' education'. It seems necessary to underline that Danish welfare policy has been changing rather radically. The classic model was an understanding of welfare as social insurance and/or social distribution, based on social solidarity. The modern model treats welfare as social service and/or social investment. This means that citizens are changing role, from user and/or citizen to consumer and/or investor. The Danish state, in line with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, "service" and "investment", imply severe changes in hitherto known concepts of family life, the relationship between parents and children, etc. As an example, the investment model points to a new configuration of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the field of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of "winner takes all", since a political majority is enabled to set agendas in societal fields formerly protected by the tripartite power and the citizens' rights of freedom. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which, on the one hand, is capable of competing under market conditions and, on the other, can be governed by contracts. This represents a new form of close linking of politics, economy and professional work.
Attempts at controlling education, pedagogy and thereby the population are not a recent invention; in European history we could easily point to several such experiments. What is really new is the linking of political priorities to the exercise of public activities through economic incentives. By defining visible goals for public servants, by introducing measurement of achievements and effects, and by implementing a new wage policy dependent on achievements and/or effects, a new system of accountability is manufactured. The consequences are already perceptible. The government decides on special interventions concerning parents, children or youngsters; public servants at the municipal level are instructed to carry out their services following a manual; and parents are no longer protected by privacy. Protection of privacy and of minorities is no longer a valid argument against further interventions in people's lives (health, food, school, etc.). Citizens are becoming objects of investment, which also implies that people invest in their own health, education and family. This means that investments in changes of lifestyle and in the development of competences go hand in hand. The programmes mentioned below are conditioned by this shift.
Abstract:
Additive manufacturing by melting of metal powders is an innovative method for creating one-offs and customized parts. In branches like dentistry, aerospace engineering and tool making, these manufacturing methods are already established. Besides all the advantages, like freedom of design, manufacturing without a tool and a reduced time-to-market, there are however some disadvantages, such as limited reproducibility and surface quality. The surface quality strongly depends on the orientation of the component in the build chamber and on the process parameters, namely laser power and exposure time, but also on the so-called "hatch" strategy, i.e. the way the laser exposes the solid areas. This paper deals with the investigation and characterization of the surface quality of parts produced by selective laser melting (SLM). The main process parameters, including part orientation, part size and hatch strategy, are investigated and monitored. The outcome is a recommendation of suitable hatch strategies depending on the desired part properties. This includes measured values and takes process stability and reproducibility into account.
Abstract:
On October 10, 2013, a Chamber of the European Court of Human Rights (ECtHR) handed down a judgment (Delfi v. Estonia) upholding, as compatible with the Convention, an Estonian law which, as interpreted, held a news portal liable for the defamatory comments of its users. Amongst the considerations that led the Court to find no violation of freedom of expression in this particular case were, above all, the inadequacy of the automatic screening system adopted by the website and the users' option to post their comments anonymously (i.e. without prior registration via email), which in the Court's view rendered ineffective the protection conferred on the injured party via direct legal action against the authors of the comments. Drawing on the implications of this (not yet final) ruling, this paper discusses a few questions that the tension between the risk of wrongful use of information and the right to anonymity generates for the development of Internet communication, and examines the role that intermediary liability legislation can play in managing this tension.
Abstract:
Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally conceived as a doctrine protecting the negative freedom of citizens in vertical relations, that is, between citizen and state. Over the years, the Court has extended privacy protection to horizontal relations and has gradually accepted that individual autonomy is an equally important value underlying the right to privacy. However, in most of the recent cases regarding Article 8 ECHR, the Court goes beyond the protection of negative freedom and individual autonomy and instead focuses on self-expression, personal development and human flourishing. Accepting this virtue-ethical notion, in addition to the traditional Kantian focus on individual autonomy and human dignity, as a core value of Article 8 ECHR may prove vital for the protection of privacy in the age of Big Data.
Abstract:
During the last decades, the virtual world has increasingly gained importance, and in this context the enforcement of privacy rights has become more and more difficult. An important emanation of this trend is the right to be forgotten, enshrining the protection of the data subject's rights over his or her "own" data. Even though the right to be forgotten has been made part of the proposal for a completely revised Data Protection Regulation and has recently been acknowledged by the Court of Justice of the European Union (the "Google/Spain" decision), to date the discussions about the right, and especially about its implementation with regard to the fundamental right to freedom of expression, have remained rather vague and need to be examined in more depth.
Abstract:
In Europe, roughly three regimes apply to the liability of Internet intermediaries for privacy violations committed by users through their networks: the e-Commerce Directive, which, under certain conditions, excludes them from liability; the Data Protection Directive, which imposes a number of duties and responsibilities on providers processing personal data; and the freedom of expression, contained inter alia in the ECHR, which, under certain conditions, grants Internet providers several privileges and freedoms. Each doctrine has its own field of application, but they also partially overlap. In practice, this creates legal inequality and uncertainty, especially with regard to providers that host online platforms and process User Generated Content.
Abstract:
In 2009 Switzerland, long an apparent beacon of European toleration and neutrality, voted to ban the erection of minarets. Internal religious matters are normally dealt with at the regional or local level – not at the level of the Swiss national parliament – although the state does seek to ensure good order and peaceful relations between different faith communities. Indeed, the freedom of these communities to believe and function publicly is enshrined in law. However, as a matter of national policy, now constitutionally embedded, one religious group, the Muslims, is not permitted to build its distinctive religious edifice, the minaret. Switzerland may have joined the rest of Europe in engaging with the challenge that the Islamic presence poses to European identity and values, but the rejection of a symbol of the presence of one faith – in this case, Islam – by a society that is otherwise predominantly secular, pluralist, and of Christian heritage poses significant concerns. How and why did this happen? What are the implications? This paper discusses some of the issues involved, concluding that the ban is by no means irreversible. Tolerant neutrality may yet again be a leitmotif of Swiss culture and not just of Swiss foreign policy.
Abstract:
Background: Catheter ablation (CA) of ventricular tachycardia (VT) is an important treatment option in patients with structural heart disease (SHD) and an implantable cardioverter-defibrillator (ICD). A subset of patients requires epicardial CA for VT. Objective: The purpose of the study was to assess the significance of epicardial CA in these patients after a systematic sequential endocardial approach. Methods: CA procedures for VT performed between January 2009 and October 2012 were analyzed. A sequential CA approach guided by earliest ventricular activation, pace mapping, entrainment and stimulus-to-QRS interval analysis was used. Acute CA success was assessed by programmed ventricular stimulation. ICD interrogation and 24-h Holter ECG were used to evaluate long-term success. Results: One hundred sixty VT ablation procedures were performed in 126 consecutive patients (114 men; age 65 ± 12 years). Endocardial CA succeeded in 250 (94%) of 265 treated VTs. For 15 (6%) VTs an additional epicardial CA was performed, which succeeded in 9 of these 15. Long-term follow-up (25 ± 18.2 months) showed freedom from VT in 104 patients (82%) after 1.2 ± 0.5 procedures; 11 patients (9%) suffered from repeated ICD shocks and 11 (9%) died due to worsening heart failure. Conclusions: Despite a heterogeneous substrate for VT in SHD, endocardial CA alone results in high acute success rates. In this study, additional epicardial CA following a sequential endocardial mapping and CA approach was performed in 6% of VTs. Thus, given the possible complications, epicardial CA should only be considered if endocardial CA fails.
Abstract:
Simulating surface wind over complex terrain is a challenge in regional climate modelling. This study therefore aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme, and the way WRF is nested to the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, where the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser. The results demonstrate the important role of horizontal resolution: the step from 6 to 2 km significantly improves model performance. In summary, the combination of a grid size of 2 km, a non-local PBL scheme modified to explicitly account for non-resolved orography, and analysis or spectral nudging is a superior combination when dynamical downscaling aims at reproducing real wind fields.
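The set-up comparison described here ultimately reduces to verification statistics of simulated against observed winds. The following is a purely illustrative sketch, not the study's code: the time series, set-up names and error magnitudes are invented assumptions, chosen only to mimic the reported behaviour (overestimation reduced by nudging).

```python
import numpy as np

def wind_verification(sim: np.ndarray, obs: np.ndarray) -> dict:
    """Basic verification metrics for a simulated wind-speed series
    against station observations (illustrative only)."""
    bias = float(np.mean(sim - obs))                # >0 means overestimation
    rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
    corr = float(np.corrcoef(sim, obs)[0, 1])       # temporal agreement
    peak_err = float(np.max(sim) - np.max(obs))     # error in the wind peak
    return {"bias": bias, "rmse": rmse, "corr": corr, "peak_error": peak_err}

# Hypothetical hourly series for one storm case under two set-ups:
rng = np.random.default_rng(1)
obs = 10 + 5 * np.sin(np.linspace(0, 3, 72)) + rng.normal(0, 1, 72)
run_free = obs + 2.5 + rng.normal(0, 2, 72)         # free run: strong bias
run_nudged = obs + 0.8 + rng.normal(0, 1.5, 72)     # nudged run: reduced bias
for name, sim in [("free run", run_free), ("spectral nudging", run_nudged)]:
    print(name, wind_verification(sim, obs))
```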
Abstract:
BACKGROUND: Catheter ablation is now widely used for the treatment of persistent atrial fibrillation (AF), but there is a paucity of data about long-term outcomes. This study evaluates (1) 5-year single- and multiple-procedure success and (2) prognostic factors for arrhythmia recurrence after catheter ablation of persistent AF using the stepwise approach aiming at AF termination. METHODS AND RESULTS: A total of 549 patients with persistent AF underwent de novo catheter ablation using the stepwise approach (2007-2009), and 493 patients were included (Holter ECGs at least every 6 months). Mean follow-up was 59 ± 16 months with 2.1 ± 1.1 procedures per patient. Single- and multiple-procedure success rates were 20.1% and 55.9%, respectively (80% off antiarrhythmic drugs). Antiarrhythmic-drug-free multiple-procedure success was 46%. Long-term recurrences (n=171) were paroxysmal AF in 48 patients (28%) and persistent AF/atrial tachycardia in 123 patients (72%). Multivariable recurrent-event analysis revealed the following factors favoring arrhythmia recurrence: failure to terminate AF during the index procedure (hazard ratio [HR], 1.279; 95% confidence interval [CI], 1.093-1.497; P = 0.002), number of procedures (HR, 1.154; 95% CI, 1.051-1.267; P = 0.003), female sex (HR, 1.263; 95% CI, 1.027-1.553; P = 0.027), and the presence of structural heart disease (HR, 1.236; 95% CI, 1.003-1.524; P = 0.047). AF termination was correlated with a higher rate of consecutive procedures because of atrial tachycardia recurrences (HR, 1.71; 95% CI, 1.20-2.43; P = 0.003). CONCLUSIONS: Catheter ablation of persistent AF using the stepwise approach provides limited long-term freedom from arrhythmia, often requiring multiple procedures. AF termination, the number of procedures, sex, and the presence of structural heart disease correlate with outcome. AF termination is associated with consecutive atrial tachycardia procedures.
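Hazard ratios of the kind reported above come from a Cox-type regression. As a minimal sketch, assuming synthetic stand-in data (the study used a multivariable recurrent-event analysis, which the simple single-event Cox fit below only approximates; all column names and coefficients are our assumptions):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 493  # cohort size matching the abstract; data are simulated
df = pd.DataFrame({
    "af_not_terminated": rng.integers(0, 2, n),      # failure to terminate AF
    "n_procedures": rng.integers(1, 5, n),
    "female": rng.integers(0, 2, n),
    "structural_heart_disease": rng.integers(0, 2, n),
})

# Simulate months to recurrence with hazard increasing in each covariate
# (log-HRs roughly matching the magnitudes reported in the abstract).
lin = (0.25 * df["af_not_terminated"] + 0.14 * df["n_procedures"]
       + 0.23 * df["female"] + 0.21 * df["structural_heart_disease"])
t = rng.exponential(60.0 / np.exp(lin))
df["recurred"] = (t < 75).astype(int)                # administrative censoring
df["months_to_recurrence"] = np.minimum(t, 75.0)

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_recurrence", event_col="recurred")
cph.print_summary()  # the exp(coef) column gives the hazard ratios
```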
Abstract:
Objectives. This paper seeks to assess the effect of regression model misspecification on statistical power in a variety of situations. Methods and results. The effect of misspecification in regression can be approximated by evaluating the correlation between the correct specification and the misspecification of the outcome variable (Harris 2010). In this paper, three misspecified models (linear, categorical and fractional polynomial) were considered. In the first section, the mathematical method of calculating the correlation between correct and misspecified models with simple mathematical forms was derived and demonstrated. In the second section, data from the National Health and Nutrition Examination Survey (NHANES 2007-2008) were used to examine such correlations. Our study shows that, compared to linear or categorical models, the fractional polynomial models, with their higher correlations, provided a better approximation of the true relationship, as illustrated by LOESS regression. In the third section, we present the results of simulation studies demonstrating that misspecification in regression can produce marked decreases in power with small sample sizes. However, the categorical model had the greatest power, ranging from 0.877 to 0.936 depending on the sample size and outcome variable used. The power of the fractional polynomial model was close to that of the linear model, ranging from 0.69 to 0.83, and appeared to be affected by the increased degrees of freedom of this model. Conclusion. Correlations between alternative model specifications can be used to provide a good approximation of the effect of misspecification on statistical power when the sample size is large. When model specifications have known simple mathematical forms, such correlations can be calculated mathematically. Actual public health data from NHANES 2007-2008 were used as examples to demonstrate situations with an unknown or complex correct model specification. Simulation of power for misspecified models confirmed the results based on the correlation methods, but also illustrated the effect of model degrees of freedom on power.
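The third section's power simulation can be sketched along the following lines. This is a minimal illustration, not the dissertation's code: the log-shaped true curve, noise level, sample size, quartile-based categorization, and the FP(1, 0.5) design used as the fractional polynomial stand-in are all our assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def design(x: np.ndarray, spec: str) -> np.ndarray:
    """Design matrix for one of three (mis)specifications of x."""
    n = len(x)
    if spec == "linear":
        return np.column_stack([np.ones(n), x])
    if spec == "categorical":  # quartile dummies, first quartile as reference
        q = np.searchsorted(np.quantile(x, [0.25, 0.5, 0.75]), x)
        return np.column_stack([np.ones(n)] + [(q == k).astype(float) for k in (1, 2, 3)])
    return np.column_stack([np.ones(n), x, np.sqrt(x)])  # FP(1, 0.5) stand-in

def power(spec: str, n: int = 50, n_sims: int = 2000, alpha: float = 0.05) -> float:
    """Monte Carlo power of the overall F-test when the true curve is log(1+x)."""
    hits = 0
    for _ in range(n_sims):
        x = rng.uniform(0, 10, n)
        y = np.log1p(x) + rng.normal(0, 1.5, n)       # true model: logarithmic
        X = design(x, spec)
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        rss_full = np.sum((y - X @ beta) ** 2)
        rss_null = np.sum((y - y.mean()) ** 2)
        df1, df2 = X.shape[1] - 1, n - X.shape[1]     # model and residual dof
        F = ((rss_null - rss_full) / df1) / (rss_full / df2)
        hits += stats.f.sf(F, df1, df2) < alpha
    return hits / n_sims

for spec in ("linear", "categorical", "polynomial"):
    print(f"{spec:12s} power = {power(spec):.3f}")
```

Varying n in this sketch reproduces the qualitative point of the abstract: with small samples, specifications that spend more degrees of freedom (here the dummy and two-term designs) pay a visible price in residual degrees of freedom, while better approximating the true curve.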