890 results for Broken Promises
Abstract:
INTRODUCTION: The purpose of our study was to retrospectively evaluate the clinical and radiological results of subtrochanteric fractures treated with a long gamma nail (LGN). The LGN has been the implant of choice at our level-1 trauma center since 1992. MATERIALS AND METHODS: Over a period of 7 years, we treated 90 consecutive patients with subtrochanteric fractures. To evaluate the clinical and radiological outcomes, we reviewed the clinical and radiographic charts of these patients, who were followed for a mean of 2 years (range 13-36 months). RESULTS: We found no intra- or perioperative complications and no early or late infections. Clinical and radiological union was achieved in all patients at a mean of 4.3 months (range 3-9 months); in 24 cases (30%) the distal locking bolts were removed as a planned secondary surgery to enhance callus formation and remodeling. Three patients (3.3%) needed unplanned secondary surgery for problems related to the nailing technique. Two mechanical failures with breakage of the nail occurred due to proximal varus malalignment; one was treated with exchange nailing and grafting, the other by removal of the broken hardware, blade-plating, and bone grafting. One fracture below a short LGN was treated by exchange nailing. CONCLUSIONS: The minimally invasive technique and simple application of the LGN lead to a low complication rate in these difficult fractures after a relatively short learning curve. The biomechanical properties of this implant allow early mobilization and partial weight-bearing, even in patients with advanced osteoporosis.
Abstract:
Equality with men in the world of paid work has been a major feminist objective. Given that work in the 'public' sphere has historically been shaped on the assumption that the 'worker' will be male, national employment systems which facilitate masculine employment patterns (i.e. full-time work and unbroken employment careers) might be expected to be more likely to generate gender equality. This paper compares women's employment in France (where 'masculine' careers for women are common) and Britain (where part-time work and broken employment careers are more likely) at the macro, meso (occupational), and micro (individual) levels. The two occupations studied are finance and pharmacy. The evidence presented suggests that there are considerable similarities between women in the two countries at the occupational and individual levels, despite national variations. In the light of this evidence, structural and individual explanations of women's employment behaviour are examined, and the continuing significance of structural constraint on the patterning of gender relations is emphasised.
Abstract:
Roughly fifteen years ago, the Church of Jesus Christ of Latter-day Saints published a new proposed standard file format, called GEDCOM. It was designed to allow different genealogy programs to exchange data. Five years later, in May 2000, the GENTECH Data Modeling Project appeared, with the support of the Federation of Genealogical Societies (FGS) and other American genealogical societies. It attempted to define a genealogical logical data model to facilitate data exchange between different genealogy programs. Although genealogists deal with an enormous variety of data sources, one of the central concepts of this data model was that all genealogical data could be broken down into a series of short, formal genealogical statements. This was more versatile than merely exporting and importing data records with predefined fields. The project was finally absorbed in 2004 by the National Genealogical Society (NGS). Despite serving as a genealogical reference for many applications, these models have serious drawbacks when adapting to different cultural and social environments. At present there is no formal proposal for a recognized standard to represent the family domain. Here we propose an alternative conceptual model, largely inherited from the aforementioned models. The design is intended to overcome their limitations. However, its major innovation lies in applying the ontological paradigm when modeling statements and entities.
Abstract:
The first decade of the twenty-first century may be remembered for the rebirth of consensus on labour market policy. After three decades of bitter political and ideological controversy between a neo-liberal and a traditional social democratic approach, a new model, often labelled flexicurity, has emerged. This model is promoted by numerous political organisations since it promises to put an end to the old trade-off between equality and efficiency. Several countries are embracing the flexicurity model as a blueprint for labour market reform, but others, mostly belonging to the 'Mediterranean Rim', are clearly lagging behind. Why is it so difficult for these countries to implement the flexicurity model? This paper argues that the application of a flexicurity strategy in these countries is complicated by the lack of social trust between social partners and the state as well as political economy traditions that highlight the role of labour market regulation as a source of social protection.
Abstract:
The emergence of assisted reproductive technologies (ART) has been accompanied by two kinds of discourse. On the one hand, a discourse promising an extension of individuals' reproductive choices, their procreative liberty and autonomy, whose most extreme form can be summed up as: a child when I want and how I want. On the other hand, a discourse announcing a series of disasters to come, such as the collapse of the family as an institution and the modification of the human species. In other words, a tension between promises and disasters that confronts contemporary societies with numerous social, political and ethical challenges, in particular with regard to the regulation of ART: who may have access? Which procedures should be authorized? Which should be limited? These complex questions have no simple or obvious answers, and the variety of legislative responses to them illustrates this complexity. Ethics can play a fundamental role here. Without claiming to give ready-made, easily applicable answers, it offers a space for reflection and the privilege of taking some distance from contemporary issues. It is in this perspective that this study examines the ethical stakes of ART from a perspective of justice. However, within bioethics, which largely stems from the liberal tradition, the tension described above has led the field to justify a number of inequalities rather than seeking to overcome them. Evaluating the practice of ART from a perspective of justice therefore first requires a re-evaluation of the concept of justice itself.
By articulating Joan Tronto's ethic of care with Martha Nussbaum's capability approach, both of which place vulnerability at the heart of the person, we propose a conception of justice founded on an anthropology of vulnerability. Within the framework of ART practice in Switzerland, and starting from the law on medically assisted procreation (LPMA), this conception allows us to identify the normative constructions that lead to the non-recognition, and thereby the setting aside, of certain forms of vulnerability: a generic vulnerability and a socio-economic vulnerability. Addressing mainly the question of generic vulnerability, our analyses have implications for the conceptions of the family, the best interests of the child, woman, and nature as they are currently conveyed by a naturalized conception of ART. Responding to the identified vulnerabilities by giving them a place then means displacing these naturalized conceptions, so that vulnerabilities are integrated into social practices and the requirements of justice are thereby fulfilled.
Abstract:
We study the damage-enhanced creep rupture of disordered materials by means of a fiber bundle model. Broken fibers undergo a slow stress relaxation modeled by a Maxwell element whose stress exponent m can vary over a broad range. Under global load sharing we show that, due to the strength disorder of fibers, the lifetime t_f of the bundle has sample-to-sample fluctuations characterized by a log-normal distribution independent of the type of disorder. We determine the Monkman-Grant relation of the model and establish a relation between the rupture life t_f and the characteristic time t_m of the intermediate creep regime of the bundle, where the minimum strain rate is reached, making possible reliable estimates of t_f from short-term measurements. Approaching macroscopic failure, the deformation rate has a finite-time power-law singularity whose exponent is a decreasing function of m. On the microlevel, the distribution of waiting times is found to have a power-law behavior with m-dependent exponents that differ below and above the critical load of the bundle. Approaching the critical load from above, the cutoff value of the distributions has a power-law divergence whose exponent coincides with the stress exponent of the Maxwell elements.
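The global load sharing rule underlying the model can be illustrated with a minimal static fiber bundle sketch (the bundle size and uniform strength disorder below are illustrative choices; the paper's Maxwell-element creep dynamics are not modeled here):

```python
import random

def bundle_strength(n, seed=None):
    """Critical load per fiber of a global-load-sharing bundle with
    uniform(0, 1) strength thresholds (an illustrative disorder choice)."""
    rng = random.Random(seed)
    thresholds = sorted(rng.random() for _ in range(n))
    # Sort fibers by threshold: once the k weakest have failed, the
    # remaining n - k fibers share the load equally, so the bundle
    # sustains F = thresholds[k] * (n - k) just before failure k + 1.
    return max(thresholds[k] * (n - k) for k in range(n)) / n

# Sample-to-sample fluctuations of the critical load across bundles,
# analogous to the lifetime fluctuations studied in the paper.
strengths = [bundle_strength(200, seed=s) for s in range(100)]
mean = sum(strengths) / len(strengths)
```

For uniform disorder the per-fiber critical load concentrates near max x(1 - x) = 1/4 as the bundle grows; the spread of `strengths` across seeds is the static analogue of the lifetime fluctuations discussed in the abstract.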
Abstract:
This study examined the validity and reliability of the French version of two observer-rated measures developed to assess cognitive errors (Cognitive Errors Rating System [CERS]) [6] and coping action patterns (Coping Action Patterns Rating System [CAPRS]) [22,24]. The CE measures 14 cognitive errors, broken down according to their valence, positive or negative (see the definitions by A.T. Beck), and the CAP measures 12 coping categories, based on a comprehensive literature review, each broken down into three levels of action (affective, behavioural, cognitive). Thirty (N = 30) subjects recruited from a community sample participated in the study. They were interviewed according to a standardized clinical protocol; these interviews were transcribed and analysed with both observer-rated systems. Results showed that the inter-rater reliability of the two measures is good and that their internal validity is satisfactory, as indicated by a non-significant canonical correlation between CAP and CE. With regard to discriminant validity, we found a non-significant canonical correlation between the CAPRS and the CISS, one of the most widely used self-report questionnaires measuring coping. The same can be said for the correlation with a self-report questionnaire measuring symptoms (SCL-90-R). These results confirm the absence of confounds in the assessment of cognitive errors and coping as assessed by these observer-rated scales and add an argument in favour of the French validation of the CE-CAP rating scales. (C) 2010 Elsevier Masson SAS. All rights reserved.
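The abstract reports good inter-rater reliability for both observer-rated systems. One standard statistic for such agreement is Cohen's kappa; a minimal sketch with hypothetical ratings (the labels below are illustrative, not the actual CERS/CAPRS categories or data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    labels = set(ca) | set(cb)
    # Expected chance agreement from each rater's marginal frequencies.
    expected = sum(ca[l] * cb[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical valence ratings of 10 interview segments by two raters.
a = ["neg", "neg", "pos", "neg", "pos", "pos", "neg", "pos", "neg", "neg"]
b = ["neg", "pos", "pos", "neg", "pos", "pos", "neg", "pos", "neg", "neg"]
kappa = cohens_kappa(a, b)   # 9/10 observed agreement, 0.5 expected -> 0.8
```

Values above roughly 0.6 are conventionally read as good agreement, which is the kind of threshold an inter-rater reliability claim like the one above rests on.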
Abstract:
The completion of the sequencing of the mouse genome promises to help predict human genes with greater accuracy. While current ab initio gene prediction programs are remarkably sensitive (i.e., they predict at least a fragment of most genes), their specificity is often low, predicting a large number of false-positive genes in the human genome. Sequence conservation at the protein level with the mouse genome can help eliminate some of those false positives. Here we describe SGP2, a gene prediction program that combines ab initio gene prediction with TBLASTX searches between two genome sequences to provide both sensitive and specific gene predictions. The accuracy of SGP2 when used to predict genes by comparing the human and mouse genomes is assessed on a number of data sets, including single-gene data sets, the highly curated human chromosome 22 predictions, and entire genome predictions from ENSEMBL. Results indicate that SGP2 outperforms purely ab initio gene prediction methods. Results also indicate that SGP2 works about as well with 3x shotgun data as with fully assembled genomes. SGP2 provides a high enough specificity that its predictions can be experimentally verified at a reasonable cost. SGP2 was used to generate a complete set of gene predictions for both human and mouse by comparing the two genomes. Our results suggest that another few thousand human and mouse genes currently not in ENSEMBL are worth verifying experimentally.
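The core idea, boosting ab initio exon scores where cross-species conservation is found, can be sketched as a toy combiner (the coordinates, scores, and bonus rule below are illustrative assumptions, not SGP2's actual scoring scheme):

```python
def rescore_exons(exons, hits, bonus=1.0):
    """Toy combiner: add a conservation bonus to each candidate exon's
    ab initio score, proportional to its overlap with homology hits.
    Coordinates are half-open (start, end); scores are arbitrary units.
    This illustrates the SGP2 idea, not its actual scoring."""
    rescored = []
    for start, end, score in exons:
        overlap = sum(max(0, min(end, h_end) - max(start, h_start))
                      for h_start, h_end in hits)
        rescored.append((start, end, score + bonus * overlap / (end - start)))
    return rescored

exons = [(100, 200, 2.0), (500, 650, 1.5)]   # (start, end, ab initio score)
hits = [(150, 250), (900, 1000)]             # conserved regions vs. mouse
new = rescore_exons(exons, hits)
# The first exon overlaps a hit by 50 bp and gains 0.5; the second is
# unsupported by conservation and keeps its ab initio score.
```

Downstream, a threshold on the combined score would discard unsupported candidates, which is how conservation raises specificity without sacrificing the sensitivity of the ab initio predictions.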
Modern Vaccines/Adjuvants Formulation-Session 2 (Plenary II): May 15-17, 2013-Lausanne, Switzerland.
Abstract:
On 15-17 May 2013, the Fourth International Conference on Modern Vaccines/Adjuvants Formulation was held in Lausanne, Switzerland, gathering stakeholders from academia and industry to discuss challenges, advances, and promises in the field of vaccine adjuvants. Plenary session 2 of the meeting comprised four presentations covering: (1) the recent set-up of an adjuvant technology transfer and training platform in Switzerland, (2) a proposition to revisit existing paradigms of modern vaccinology, (3) the properties of polyethyleneimine as a potential new vaccine adjuvant, and (4) progress in the design of HIV vaccine candidates able to induce broadly neutralizing antibodies.
Abstract:
In the past century, public health has been credited with adding 25 years to life expectancy by contributing to the decline in illness and injury. Progress has been made, for example, in smoking reduction, infectious disease, and motor vehicle and workplace injuries. Besides its focus on traditional concerns such as clean water and safe food, public health is adapting to meet emerging health problems. Particularly troublesome are health threats to youth: teenage pregnancies, violence, substance abuse, sexually transmitted diseases, and other conditions associated with high-risk behaviors. These threats add to burgeoning health care costs. By one conservative estimate, $69 billion in medical spending could be averted through public health strategies aimed at heart disease, stroke, fatal and nonfatal occupational injuries, motor vehicle-related injuries, low birth weight, and violence. These strategies require the collaboration of many groups in the public and private sectors. Collaboration is the bedrock of public health and Healthy Iowans planning. At the core of Healthy Iowans 2000 and its successor, Healthy Iowans 2010, is the idea that all Iowans benefit when stakeholders decide on disease prevention and health promotion strategies and agree to work together on them. These strategies can improve the quality of life and hold down health care costs. The payoff for health promotion and disease prevention is not immediate, but it has long-lasting benefits. The Iowa plan is a companion to the national plan, Healthy People 2010. An initiative to improve the health of Americans, the national plan is the driving force for federal resource allocation for disease prevention and health promotion. The state plan is used in the same way. Both plans have received broad support from Republican and Democratic administrations. Community planners are using the state plan to help assess health needs and craft health improvement plans.
Healthy Iowans 2010 was written at an unusual point in history – a new decade, a new century, a new millennium. The introduction was optimistic. “The 21st century,” it says, “promises to add life as well as years through improved health habits coupled with medical advances. Scientists have suggested that if these changes occur, the definition of adulthood will also change. An extraordinary number of people will live fuller, more active lives beyond that expected in the late 20th century.” At the same time, the country has spawned a new generation of health hazards. According to Dr. William Dietz of the Centers for Disease Control and Prevention (CDC), it has replaced “the diseases of deficiency with diseases of excess” (Newsweek, August 2, 1999). New threats, such as childhood overweight, can reverse progress made in the last century. This demands concerted action.
Abstract:
Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals that are related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al, 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be neurophysiologically interpreted, because a difference in scalp topography indicates a different configuration of brain generators. An above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al, submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al, 2004). Four subjects per experiment were analyzed, using approx. 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis we estimated the statistical distribution of voltage topographies by a Mixture of Gaussians (MofGs), which reduces our original dataset to a small number of representative voltage topographies.
We then evaluated statistically the degree of presence of these template maps across trials and whether and when this was different across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 and 0.85 for upper and lower quadrants, respectively, and was in all cases significantly higher than chance (unpaired t-test, p<0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. Their posterior probabilities revealed significant differences starting at ~250 ms post-stimulus onset. The classification accuracy rates with single-trial test data were at chance level. We therefore considered sub-averages based on five single trials. We found that for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p<0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiologic lines. As these maps were previously normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset for ten shuffles of the data.
Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level providing a novel tool to compare normal electrophysiological responses versus single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
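The pipeline described above, fitting condition templates on training trials, scoring test trials by template similarity, and assessing discrimination with an ROC measure, can be sketched on synthetic data (the mean-map template and four-channel maps below are simplifying assumptions; the study fits a Mixture of Gaussians to real multichannel topographies):

```python
import random, math

def template(trials):
    """Mean map across trials (a stand-in for a mixture-of-Gaussians fit)."""
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def score(trial, t_a, t_b):
    """Higher score -> trial is closer to template A than to template B."""
    return math.dist(trial, t_b) - math.dist(trial, t_a)

rng = random.Random(0)
map_a = [1.0, 0.2, -0.5, 0.0]          # hypothetical 4-electrode maps
map_b = [0.2, 1.0, 0.0, -0.5]
noisy = lambda m: [v + rng.gauss(0, 0.4) for v in m]
train_a = [noisy(map_a) for _ in range(90)]
train_b = [noisy(map_b) for _ in range(90)]
test_a = [noisy(map_a) for _ in range(10)]
test_b = [noisy(map_b) for _ in range(10)]

t_a, t_b = template(train_a), template(train_b)
s_a = [score(t, t_a, t_b) for t in test_a]
s_b = [score(t, t_a, t_b) for t in test_b]
# ROC area via the rank-sum (Mann-Whitney) identity: fraction of
# condition-A test trials scored above condition-B test trials.
auc = sum(a > b for a in s_a for b in s_b) / (len(s_a) * len(s_b))
```

With well-separated maps the AUC approaches 1; at chance it sits near 0.5, which mirrors how the ROC areas of 0.80-0.85 above quantify above-chance single-trial discrimination.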
Abstract:
Zeta potential is a physico-chemical parameter of particular importance for describing the sorption of contaminants at the surface of gas bubbles. Nevertheless, the interpretation of electrophoretic mobilities of gas bubbles is complex. This is due to the specific behavior of the gas at the interface and to the excess of electrical charge at the interface, which is responsible for surface conductivity. We developed a surface complexation model based on the presence of negative surface sites, which arise because the balance of accepting and donating hydrogen bonds is broken at the interface. By considering protons adsorbed on these sites followed by a diffuse layer, the electrical potential at the head-end of the diffuse layer is computed and considered to be equal to the zeta potential. The predicted zeta potential values are in very good agreement with the experimental data for H2 bubbles over a broad range of pH and NaCl concentrations. This implies that the shear plane is located at the head-end of the diffuse layer, contradicting the assumption of a stagnant diffuse layer at the gas/water interface. Our model also successfully predicts the surface tension of air bubbles in a KCl solution. (c) 2012 Elsevier Inc. All rights reserved.
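The charge-balance idea in such a model, negative surface sites whose protonation depends on the local potential, screened by a Gouy-Chapman diffuse layer via the Grahame equation, can be sketched numerically (the site density, intrinsic pK, and pH below are hypothetical parameters for illustration, not the paper's fitted values):

```python
import math

# Physical constants (SI), water at 298 K
E = 1.602e-19             # elementary charge, C
KB_T = 4.11e-21           # k_B * T, J
EPS = 78.5 * 8.854e-12    # permittivity of water, F/m
NA = 6.022e23

def surface_charge(psi, gamma, h_bulk, k_int):
    """Charge density of deprotonated sites S- (C/m^2). Protonation
    S- + H+ <-> SH uses the surface proton activity h_bulk * exp(-e psi / kT),
    i.e. Boltzmann accumulation of protons at negative potential."""
    h_surf = h_bulk * math.exp(-E * psi / KB_T)
    return -E * gamma / (1.0 + h_surf / k_int)

def diffuse_charge(psi, c_molar):
    """Grahame equation: charge screened by the diffuse layer of a 1:1 salt."""
    n0 = c_molar * 1000.0 * NA                 # ion pairs per m^3
    a = math.sqrt(8.0 * n0 * EPS * KB_T)
    return a * math.sinh(E * psi / (2.0 * KB_T))

def solve_potential(gamma, h_bulk, k_int, c_molar):
    """Bisect for the potential where the site charge equals the charge
    the diffuse layer can screen; taken here as the zeta potential."""
    f = lambda p: surface_charge(p, gamma, h_bulk, k_int) - diffuse_charge(p, c_molar)
    lo, hi = -0.4, 0.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Hypothetical parameters: 5e17 sites/m^2, pH 7, pK_int = 4, 0.01 M NaCl.
zeta = solve_potential(5e17, 1e-7, 1e-4, 0.01)   # volts, negative
```

The sketch reproduces the qualitative behavior the abstract describes: negative zeta potentials whose magnitude grows with pH and shrinks with salt concentration as the diffuse layer is compressed.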
Abstract:
This thesis examines young people's views and mental images of the Finnish National Theatre (FNT). Research was needed into the views of young people about the FNT, and the results would also be useful for the FNT's marketing operations. The research was conducted as a qualitative interview study (based on a structured questionnaire) among 16 young people in January and February 2006. Four different high schools were involved: Ressun Lukio, Mäkelänrinteen Lukio, Kallion Lukio and Helsingin Kaupungin Kuvataidelukio (which together represented art-oriented high schools) and Vuosaaren Lukio. From each high school, two boys and two girls were selected for interview. The FNT's marketing director was also interviewed. The interview questions were formulated in co-operation with the FNT. The structured questionnaire was broken down into three sections. The first section concentrated on the interviewee's hobbies and his or her past and present relationship to theatre. The second section consisted of questions about his or her views on the FNT, with different sets of questions depending on whether or not the interviewee had visited the FNT. The last section was about the interviewee's use of media. This thesis focuses on the second section of questions. The answers revealed that, out of the group of sixteen, all but one had visited the FNT. Most of them saw the FNT as a traditional and valuable institution that is easier to approach than the Finnish National Opera. Eleven of the interviewees reported that the main reason for their visit was a school project, and that without it they probably would not have gone to the FNT at all. The thesis considers co-operation between the FNT and schools, and the meaning of art education for children's and young people's positive cultural development.
Abstract:
This project focuses on studying and testing the benefits of NX Remote Desktop technology in administrative use for the Finnish Meteorological Institute's existing Linux Terminal Server Project (LTSP) environment. This was necessary because the system is becoming critical as the number of users grows and the LTSP environment expands. Although many of the supporting tasks can be done via a Secure Shell connection, testing graphical programs or desktop behaviour that way is impossible. First, the basic technologies behind NX Remote Desktop were studied; then two candidate programs, FreeNX and NoMachine NX Server, were tested. Functionality and bandwidth demands were first tested in a closed local area network, and the results were studied. The better candidate was then installed on a virtual server simulating an actual LTSP server at the Finnish Meteorological Institute, and a connection from the Internet was tested to check for problems with firewalls and security policies. The results are reported in this study. Studying and testing the two candidates showed that NoMachine NX Server provides better customer support and documentation. The security requirements of the Finnish Meteorological Institute also had to be considered, and since updates along with new development tools are announced for the next version of the program, this version was chosen. Studies also show that although NoMachine promises a swift connection over an average bandwidth of 20 kbit/s, at least double that is needed. This project gives an overview of available remote desktop products and their benefits. NX Remote Desktop technology is studied, and installation instructions are included. Testing was done in both the closed and the actual environment, and problems and suggestions are studied and analyzed.
The installation on the actual LTSP server has not yet been made, but a virtual server has been set up in the same place in terms of network topology. This ensures that if the administrators are satisfied with the system, installing and setting it up will go as described in this report.
Abstract:
The properties and cosmological importance of a class of non-topological solitons, Q-balls, are studied. Aspects of Q-ball solutions and Q-ball cosmology discussed in the literature are reviewed. Q-balls are considered in particular in the Minimal Supersymmetric Standard Model, with supersymmetry broken by a hidden-sector mechanism mediated by either gravity or gauge interactions. Q-ball profiles, charge-energy relations, and evaporation rates for realistic Q-ball profiles are calculated for general polynomial potentials and for the gravity-mediated scenario. In all cases, the evaporation rates are found to increase with decreasing charge. Q-ball collisions are studied by numerical means in the two supersymmetry-breaking scenarios. It is noted that the collision processes can be divided into three types: fusion, charge transfer, and elastic scattering. Cross-sections are calculated for the different types of processes in the different scenarios. The formation of Q-balls from the fragmentation of the Affleck-Dine condensate is studied by numerical and analytical means. The charge distribution is found to depend strongly on the initial energy-charge ratio of the condensate. The final state is typically found to consist of Q-balls and anti-Q-balls in a state of maximum entropy. By studying the relaxation of excited Q-balls, the rate at which excess energy can be emitted is calculated in the gravity-mediated scenario. The Q-ball is also found to withstand excess energy well without significant charge loss. The possible cosmological consequences of these Q-ball properties are discussed.