879 results for "Equal pay for equal work"


Relevance: 30.00%

Abstract:

Social economy enterprises share a set of values that leads them to behave differently from ordinary enterprises (public limited and labour companies) outside the social economy with respect to the composition of their workforces, working conditions, productive specialisation and geographical location. This differential behaviour is, in turn, an important contribution to social cohesion and, specifically from a gender point of view, it improves the presence and position of women in the labour market. The main objective of this research is to assess whether social economy enterprises differ from ordinary ones in terms of equal opportunities, working conditions and career trajectories from a gender perspective, focusing on the case of Spain. Using the Continuous Sample of Working Lives (Muestra Continua de Vidas Laborales, 2010), two groups of enterprises are identified, social economy (target group) and "non-social" or ordinary economy (control group), equivalent in size and sector of activity. For each group and its workers, parametric and non-parametric tests of differences in means between women and men are carried out for various job characteristics such as the type of working day, contract duration and the stability of the employment trajectory within the enterprise. In addition, wage discrimination in both groups is estimated following the Oaxaca-Blinder model. The results show that social economy enterprises offer women better conditions of access to and permanence in employment, greater job stability and less wage discrimination relative to men.
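
A minimal sketch of a two-fold Oaxaca-Blinder decomposition, the method the abstract cites, may make it concrete: the mean log-wage gap is split into a part explained by differences in observed characteristics and an unexplained part usually read as discrimination. The use of the male coefficient vector as the non-discriminatory reference, and all variable names, are illustrative assumptions rather than details taken from the study.

```python
# Hedged sketch of a two-fold Oaxaca-Blinder decomposition (assumptions noted above).
import statsmodels.api as sm

def oaxaca_blinder(X_m, y_m, X_f, y_f):
    """Split the mean log-wage gap into explained and unexplained components."""
    Xm, Xf = sm.add_constant(X_m), sm.add_constant(X_f)
    beta_m = sm.OLS(y_m, Xm).fit().params          # male wage equation
    beta_f = sm.OLS(y_f, Xf).fit().params          # female wage equation
    gap = y_m.mean() - y_f.mean()                  # raw mean log-wage gap
    explained = (Xm.mean(axis=0) - Xf.mean(axis=0)) @ beta_m   # endowment differences
    unexplained = Xf.mean(axis=0) @ (beta_m - beta_f)          # coefficient differences
    return gap, explained, unexplained             # gap == explained + unexplained
```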

Relevance: 30.00%

Abstract:

This article examines the socio-economic evolution of the social economy sector in the Basque Country during the 2008-2014 period of economic crisis. Data were obtained within a framework of collaboration between the university, the Basque Government and the private social economy sector. The results suggest that these entities fared better, in terms of both the number of enterprises and employment, than the general economy of the Basque Country, even as the public-policy context for the social economy worsened over the years. In economic terms, however (measured through the Gross Value Added generated), they were not able to weather the crisis on equal terms with the general economy. The main contribution of this research is that, unlike similar studies, it examines the evolution of the social economy sector as a whole, taking as its reference a broad period of the recent economic crisis.

Relevance: 30.00%

Abstract:

In this paper, an analysis of radio channel characteristics for single- and multiple-antenna body-worn systems for use in body-to-body communications is presented. The work was based on an extensive measurement campaign conducted at 2.45 GHz, representative of an indoor sweep-and-search scenario for fire and rescue personnel. Using maximum-likelihood estimation in conjunction with the Akaike information criterion (AIC), five candidate probability distributions were investigated, and from these the κ-μ distribution was found to best describe the small-scale fading observed in the body-to-body channels. Additional channel parameters such as the autocorrelation and the cross-correlation coefficient between fading signal envelopes were also analyzed. Low cross-correlation and small differences in mean signal level between potential dual-branch diversity receivers suggested that the prospect of successfully implementing diversity in this type of application is extremely good. Moreover, using selection combining, maximal-ratio combining, and equal-gain combining, up to 8.69 dB of diversity gain can be made available when four spatially separated antennas are used at the receiver. Additional improvements in the combined envelopes, through lower level-crossing rates and shorter fade durations at low signal levels, were also observed.
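
The distribution-selection step lends itself to a short illustration: fit each candidate by maximum likelihood and rank by AIC. This is a minimal sketch; the candidate set below is an assumption (scipy does not implement the κ-μ distribution, so it cannot appear here), and no claim is made that these were the paper's five candidates.

```python
# Hedged sketch of ML fitting plus AIC ranking for fading-envelope data.
import numpy as np
from scipy import stats

CANDIDATES = {
    "rayleigh": stats.rayleigh,
    "rice": stats.rice,
    "nakagami": stats.nakagami,
    "lognorm": stats.lognorm,
    "weibull": stats.weibull_min,
}

def rank_by_aic(envelope):
    """Return (name, AIC) pairs sorted best-first for a 1-D envelope sample."""
    scores = {}
    for name, dist in CANDIDATES.items():
        params = dist.fit(envelope)                     # maximum-likelihood fit
        loglik = np.sum(dist.logpdf(envelope, *params))
        k = len(params)                                 # fitted parameter count
        scores[name] = 2 * k - 2 * loglik               # AIC: lower is better
    return sorted(scores.items(), key=lambda kv: kv[1])
```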

Relevance: 30.00%

Abstract:

There is a need for rapid, sensitive, and often high-throughput detection of pathogens in diagnostic virology. Viral gastroenteritis is a serious health issue, often leading to hospitalization in the young, the immunocompromised, and the elderly. The common causes of viral gastroenteritis include rotavirus, norovirus (genogroups I and II), astrovirus, and group F adenoviruses (serotypes 40 and 41). This article describes the work-up of two internally controlled, multiplex, probe-based PCR assays and reports on their clinical validation over a 3-year period, March 2007 to February 2010. The multiplex assays were developed using a combination of TaqMan™ and minor groove binder (MGB™) hydrolysis probes and were validated using a panel of 137 specimens previously found positive by a nested gel-based assay. The probe-based assays were more sensitive than the nested assay for adenovirus, rotavirus, and norovirus (97.3% vs. 86.1%, 100% vs. 87.8%, and 95.1% vs. 79.5%, respectively) and also more specific for the same three targets (99% vs. 95.2%, 100% vs. 93.6%, and 97.9% vs. 92.3%, respectively). For the specimens tested, both assay formats had equal sensitivity and specificity for astrovirus (100%). Overall, the probe-based assays detected 16 more positive specimens than the nested gel-based assay. Following introduction into the routine diagnostic service, a total of 9,846 specimens (7,053 pediatric, 2,793 adult) were processed with multiplexes 1 and 2 over the 3-year study period. This clinically validated, probe-based multiplex testing algorithm allows highly sensitive and timely diagnosis of the four most prominent causes of viral gastroenteritis.
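
For readers unfamiliar with how the validation percentages are derived, the sketch below shows the standard sensitivity and specificity calculation against a reference result for a single target; the variable names are illustrative, not from the study.

```python
# Hedged sketch: sensitivity and specificity of a new assay versus a reference.
def sensitivity_specificity(test, reference):
    """test, reference: equal-length lists of booleans (True = positive)."""
    tp = sum(t and r for t, r in zip(test, reference))          # true positives
    fn = sum(not t and r for t, r in zip(test, reference))      # false negatives
    tn = sum(not t and not r for t, r in zip(test, reference))  # true negatives
    fp = sum(t and not r for t, r in zip(test, reference))      # false positives
    return tp / (tp + fn), tn / (tn + fp)
```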

Relevance: 30.00%

Abstract:

Based upon the original application to the European Commission, this article gives insight into the thinking of the Euroidentities team at the point the project began. The question "Is the European 'identity project' failing?" is posed in the sense that the political and economic attainments of the European Union have not been translated into a sense of identity with, or commitment to, Europe among the populaces that have benefited from them. The urgency of European 'identity work' is asserted, and a number of levels at which European identity may be constructed are hypothesized. Euroidentities is intended to break conceptual ground by bringing together on an equal footing two apparently antagonistic views of identity, the collective and institutional on the one hand and the individual and biographical on the other, to give a more anchored and nuanced view of identity formation and transformation than either can provide on its own. Rather than following the dominant approaches to research on European identity, which have been macro-theoretical and 'top-down', retrospective in-depth qualitative biographical interviews are planned, since they provide the ideal means of gaining insight into the formation of a European identity, or multiple identities, from the 'bottom-up' perspective of non-elite groups. The reliability of the analysis will be buttressed by contrastive comparison between cases, culminating in contrastive comparison across the national project teams between cases drawn from the different 'sensitized groups' that provide the fieldwork structure of the project. The paper concludes with a summary of some of the more significant findings.

Relevance: 30.00%

Abstract:

A linear array of n calcite crystals is shown to allow the generation of a high-contrast (>10:1) train of 2^n high-energy (>100 μJ) pulses from a single ultrafast laser pulse. Advantage is taken of the pulse-splitting property of a single birefringent crystal, in which an incident laser pulse is split into two pulses of orthogonal polarization and equal intensity, separated temporally in proportion to the thickness of crystal traversed and the difference between the refractive indices along the two optic axes. In the work presented here, an array of seven calcite crystals of sequentially doubled thickness is used to produce a train of 128 pulses, each of femtosecond duration. Readily versatile properties, such as the number of pulses in the train and a variable mark-space ratio, are realized with this setup.
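
The timing of the resulting train follows directly from the delay introduced per crystal, Δt = d·Δn/c for thickness d. The sketch below enumerates the 2^n fast/slow path combinations; the 1 mm base thickness and the calcite birefringence value are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the 2**n pulse train from n doubled-thickness crystals.
from itertools import product

C = 3.0e8        # speed of light in vacuum, m/s
DELTA_N = 0.17   # assumed calcite birefringence |n_e - n_o|

def pulse_delays(n_crystals, base_thickness_m):
    """Sorted arrival times (s) of the 2**n output pulses."""
    thicknesses = [base_thickness_m * 2**k for k in range(n_crystals)]
    delays = []
    for path in product((0, 1), repeat=n_crystals):   # fast/slow choice per crystal
        delays.append(sum(p * d for p, d in zip(path, thicknesses)) * DELTA_N / C)
    return sorted(delays)

train = pulse_delays(7, 1e-3)       # seven crystals, 1 mm base thickness (assumed)
print(len(train))                   # 128 pulses
print(train[1] - train[0])          # uniform spacing = base * DELTA_N / c, ~0.57 ps
```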

Relevance: 30.00%

Abstract:

BACKGROUND: We examined the effects of leaving public sector general practitioner (GP) work and of taking up a GP position on changes in work-related psychosocial factors such as time pressure, patient-related stress, distress, and work interference with family. In addition, we examined whether changes in time pressure and patient-related stress mediated the association of employment change with changes in distress and work interference with family. METHODS: Participants were 1,705 Finnish physicians (60% women) who responded to surveys in 2006 and 2010. Analyses of covariance were conducted to examine the effect of employment change on the outcome changes, adjusted for gender, age, and response format. Mediation effects were tested following the procedures outlined by Baron and Kenny. RESULTS: Employment change was significantly associated with all the outcomes. Leaving public sector GP work was associated with substantially decreased time pressure, patient-related stress, distress, and work interference with family. In contrast, taking up a position as a public sector GP was associated with an increase in these factors. Mediation tests suggested that the associations of employment change with change in distress and change in work interference with family were partially explained by the changes in time pressure and patient-related stress. CONCLUSIONS: Our results showed that leaving public sector GP work is associated with favourable outcomes, whereas taking up a GP position in the public sector is associated with adverse effects. Primary health-care organizations should pay more attention to the working conditions of their GPs, in particular to time pressure and patient-related stress.
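
A minimal sketch of the Baron and Kenny procedure cited in the Methods is given below; the column names (an employment-change indicator, a time-pressure change score, and a distress change score) are illustrative stand-ins for the study's actual measures.

```python
# Hedged sketch of the three Baron-Kenny regression steps for one mediator.
import statsmodels.formula.api as smf

def baron_kenny(df):
    """df: DataFrame with employment_change, time_pressure_change, distress_change."""
    # Step 1: the predictor must relate to the outcome (total effect).
    total = smf.ols("distress_change ~ employment_change", df).fit()
    # Step 2: the predictor must relate to the mediator.
    a_path = smf.ols("time_pressure_change ~ employment_change", df).fit()
    # Step 3: the mediator predicts the outcome with the predictor controlled; a
    # shrunken direct effect indicates partial mediation, a null one full mediation.
    direct = smf.ols(
        "distress_change ~ employment_change + time_pressure_change", df).fit()
    return {
        "total_effect": total.params["employment_change"],
        "a_path": a_path.params["employment_change"],
        "direct_effect": direct.params["employment_change"],
        "b_path": direct.params["time_pressure_change"],
    }
```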

Relevance: 30.00%

Abstract:

Recent efforts in the finite element modelling of delamination have concentrated on the development of cohesive interface elements. These are characterised by a bilinear constitutive law: an initial high positive stiffness until a threshold stress is reached, followed by a negative tangent stiffness representing softening (or damage evolution). Complete decohesion occurs when the work done per unit area of crack surface equals a critical strain energy release rate. It is difficult to achieve a stable, oscillation-free solution beyond the onset of damage using standard implicit quasi-static methods, unless a very refined mesh is used. In the present paper, a new solution strategy based on a pseudo-transient formulation is proposed and demonstrated through the modelling of a double cantilever beam undergoing Mode I delamination. A detailed analysis of the sensitivity of the user-defined parameters is also presented. Comparisons with other published solutions using a quasi-static formulation show that the pseudo-transient formulation gives improved accuracy and oscillation-free results with coarser meshes.
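
The bilinear law described above is compact enough to state in a few lines. The sketch below gives the monotonic-loading branch only (no unloading or damage-history variable), with illustrative parameter values rather than the paper's.

```python
# Hedged sketch of a bilinear cohesive traction-separation law (Mode I).
def bilinear_traction(delta, K=1.0e6, sigma_max=30.0, G_c=0.26):
    """Traction (MPa) at opening delta (mm); K in MPa/mm, G_c in N/mm."""
    delta_0 = sigma_max / K            # separation at damage onset
    delta_f = 2.0 * G_c / sigma_max    # full decohesion: triangle area equals G_c
    if delta <= delta_0:
        return K * delta               # initial stiff elastic branch
    if delta < delta_f:
        # linear softening (negative tangent stiffness)
        return sigma_max * (delta_f - delta) / (delta_f - delta_0)
    return 0.0                         # complete decohesion
```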

Relevance: 30.00%

Abstract:

Dissolved air flotation (DAF) is a well-known coagulation-flotation system applied at large scale for microalgae harvesting. Compared with conventional harvesting technologies, DAF achieves high cell recovery at lower energy demand. By replacing microbubbles with microspheres, the innovative ballasted dissolved air flotation (BDAF) technique has been reported to achieve the same algal cell removal efficiency while saving up to 80% of the energy required by a conventional DAF unit. Using three different algae cultures (Scenedesmus obliquus, Chlorella vulgaris and Arthrospira maxima), the present work investigated the practical, economic and environmental advantages of the BDAF system over the DAF system. Cell separation of 99% was achieved with both systems; nevertheless, the BDAF technology allowed up to 95% coagulant reduction, depending on the algae species and the pH conditions adopted. In terms of floc structure and strength, the inclusion of microspheres in the algal floc generated a looser aggregate, with single-celled algae showing a more compact structure than large, filamentous cells. Overall, BDAF appeared to be a more reliable and sustainable harvesting system than DAF, as it achieved equal cell recovery while reducing energy inputs, coagulant demand and carbon emissions.

Relevance: 30.00%

Abstract:

EU non-discrimination law has seen a proliferation of discrimination grounds since 2000. Discrimination on grounds of gender (in the field of equal pay) and on grounds of nationality (generally, within the scope of application of EU law) were the only prohibited forms of discrimination in EU law until the Treaty of Amsterdam empowered the Community to legislate to combat discrimination on grounds of sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation (Article 13 EC). A proliferation of non-discrimination grounds is also characteristic of international and national non-discrimination law. Such proliferation results in an increase in potential cases of "multiple discrimination" and in the danger of diluting the demands of equality law through the ever further multiplication of grounds. The hierarchy of equality, which has been so widely criticised in EU law, is a signifier of the latter danger.
This chapter proposes to structure the confusing field of non-discrimination grounds by organising them around nodes of discrimination fields. It first reflects on different ways of establishing hierarchies between grounds. This is followed by a recount of different (narrow and wide) readings of the grounds. Finally, a comprehensive reading of the grounds gender, 'race' and disability as establishing overlapping fields of discrimination grounds is mapped out, with some examples of practical use.

Relevance: 30.00%

Abstract:

This article examines what is wrong with certain expressive acts, 'insults'. Their putative wrongfulness is distinguished from the causing of indirect harms, aggregated harms, contextual harms, and damaging misrepresentations. The article clarifies what insults are, making use of work by Neu and Austin, and argues that their wrongfulness cannot lie in the hurt caused to those at whom such acts are directed. Rather, it must lie in what they seek to do, namely to denigrate the other. The causing of offence is at most evidence that an insult has been communicated; it is not an independent ground of proscription or constraint. The victim of an insult may know that she has been insulted yet not accept or agree with the insult, and so not submit to the insulter. Hence insults need not, as Waldron argues they do, occasion dignitary harms: they do not of themselves subvert their victims' equal moral status. The claim that hateful speech endorses inequality should not be conflated with the claim that such speech directly subverts equality.

Thus, ‘wounding words’ should not unduly trouble the liberal defender of free speech either on the grounds of preventing offence or on those of avoiding dignitary harms.

Relevance: 30.00%

Abstract:

The principal feature in the evolution of the internet has been its ever-growing reach, taking in old and young, rich and poor. The internet's ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the internet governance landscape were more users coming online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating: its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or 'logged on', means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift that has made citizens creators of content rather than consumers of it has undermined the centralist view of democracy and created an environment of wiki democracy, or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. A number of key internet companies have also emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse of openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship in which the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers shaping our behaviours, and indeed our ideas of what is public. The question of what it means to create or own something, and of how all these new relationships are to be ordered and governed, is subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. For the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, with their roles seen as a kind of benign and open co-production. Each user of the internet has a stake in its efficacy, and each, by their presence and participation, contributes to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. Stakeholders do not, however, have an equal role or an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content, while harvesting the valuable data thus produced, needs to be addressed. Arguably this suggests a role for government that moves beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder and the individual's supposed growing empowerment within the ever-expanding sea of data, and it considers the role of individuals in wider governance. Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal; citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell's vision of the future has not come to pass; instead the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other's shoulders all the time, while a number of big corporations capture and sell this collective endeavour back to us.

Relevance: 30.00%

Abstract:

Objectives: Clinical studies have shown that more than 70% of primary bladder tumours arise in the area around the ureteric orifice. In this study a genomic approach was taken to explore the molecular mechanisms that may influence this phenomenon.

Methods: RNA was isolated from individual biopsies of the normal ureteric orifice and the bladder dome taken from 33 male patients. Equal amounts of the pooled ureteric orifice and dome mRNAs were labelled with Cy3 and Cy5, respectively, before hybridisation to the gene chip (UniGEM 2.0, Incyte Genomics Inc., Wilmington, Delaware, USA).

Results: Significant changes in gene expression (more than a twofold difference) were observed for 312 (3.1%) of the 10,176 genes on the array: 211 genes were upregulated and 101 downregulated. RT-PCR analysis of Cdc25B, TK1, PKM, and PDGFra supported the reliability of the microarray results. Seladin-1 was the most upregulated gene in the ureteric orifice: 8.3-fold on the microarray and 11.4-fold by real-time PCR.

Conclusions: Overall, this study demonstrates significantly altered gene expression between these two anatomically distinct areas of the normal human bladder. Of particular note is Seladin-1, whose significance in cancer is yet to be clarified. Further studies of the genes identified by this work will help clarify which of these differences influence primary bladder carcinogenesis.
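
The twofold-change criterion in the Results corresponds to a simple log-ratio threshold on the two channel intensities, sketched below; the normalisation applied by the study is not reproduced here, and all names are illustrative.

```python
# Hedged sketch of a twofold-change call on two-channel (Cy3/Cy5) intensities.
import numpy as np

def twofold_changes(cy3, cy5, threshold=2.0):
    """Indices of genes more than `threshold`-fold different between channels."""
    log2fc = np.log2(np.asarray(cy3, float) / np.asarray(cy5, float))
    up = np.where(log2fc >= np.log2(threshold))[0]     # higher in the Cy3 channel
    down = np.where(log2fc <= -np.log2(threshold))[0]  # higher in the Cy5 channel
    return up, down
```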

Relevance: 30.00%

Abstract:

Over the last 15 years, the supernova community has endeavoured to directly identify progenitor stars for core-collapse supernovae discovered in nearby galaxies. These precursors are often visible as resolved stars in high-resolution images from space- and ground-based telescopes. The discovery rate of progenitor stars is limited by the local supernova rate and the availability and depth of archive images of galaxies, with 18 detections of precursor objects and 27 upper limits to date. This review compiles these results (from 1999 to 2013) in a distance-limited sample and discusses the implications of the findings. The vast majority of the detected progenitor stars are of type II-P, II-L, or IIb, with one type Ib progenitor system detected and many more upper limits for progenitors of Ibc supernovae (14 in all). The data for these 45 supernova progenitors illustrate a remarkable deficit of high-luminosity stars above an apparent limit of log(L/L⊙) ≃ 5.1 dex. For a typical Salpeter initial mass function, one would expect to have found 13 high-luminosity, high-mass progenitors by now. There is possibly only one object in this time- and volume-limited sample that is unambiguously high mass (the progenitor of SN 2009ip), although the nature of that supernova is still debated. The possible biases due to the influence of circumstellar dust, the luminosity analysis, and the sample selection methods are reviewed; it does not appear likely that these can explain the missing high-mass progenitor stars. This review concludes that the community's work to date shows that the observed populations of supernovae in the local Universe are not, on the whole, produced by high-mass (M ≳ 18 M⊙) stars. Theoretical explosions of model stars also predict that black hole formation and failed supernovae tend to occur above an initial mass of M ≃ 18 M⊙. The models further suggest that there is no simple single mass division for neutron star or black hole formation and that there are islands of explodability for stars in the 8-120 M⊙ range. The observational constraints are quite consistent with the bulk of stars above M ≃ 18 M⊙ collapsing to form black holes with no visible supernovae.
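
The Salpeter expectation quoted above reduces to a ratio of power-law integrals, sketched below. The 8 and 150 M⊙ integration limits are assumptions for illustration and the review's exact limits may differ, so the result (about 14) only approximately reproduces the quoted 13.

```python
# Hedged sketch: expected number of progenitors above 18 M_sun for a
# Salpeter IMF, dN/dm proportional to m**-2.35, scaled to the 45-event sample.
ALPHA = 2.35

def number_fraction(m_lo, m_hi, m_min=8.0, m_max=150.0):
    """Fraction of core-collapse progenitors with m_lo < M < m_hi."""
    f = lambda a, b: a**(1 - ALPHA) - b**(1 - ALPHA)   # integral up to a constant
    return f(m_lo, m_hi) / f(m_min, m_max)

print(45 * number_fraction(18.0, 150.0))   # ~14.5 expected high-mass detections
```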

Relevance: 30.00%

Abstract:

We have measured mass spectra of positive ions produced by low-energy electron impact on thymine using a reflectron time-of-flight mass spectrometer. Using computer-controlled data acquisition, mass spectra were acquired for electron impact energies up to 100 eV in steps of 0.5 eV. Ion yield curves for most of the fragment ions were determined by fitting groups of adjacent peaks in the mass spectra with sequences of normalized Gaussians. The ion yield curves were normalized by comparing the sum of the ion yields with the average of calculated total ionization cross sections, and appearance energies were determined. The nearly equal appearance energies of the 83 u and 55 u ions observed in the present work strongly indicate that, near threshold, the 55 u ion is formed directly by the breakage of two bonds in the ring rather than by the successive loss of HNCO and CO from the parent ion. Likewise, 54 u is not formed by CO loss from 82 u. The appearance energies are in a number of cases consistent with the loss of one or more hydrogen atoms from a heavier fragment, but 70 u is not formed by hydrogen loss from 71 u.
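
The peak-fitting step lends itself to a short illustration: a group of adjacent mass peaks is modelled as a sum of normalized Gaussians whose fitted areas give the ion yields. This is a minimal sketch under assumed peak shapes; initial guesses would come from inspection of the spectrum.

```python
# Hedged sketch: fit adjacent mass peaks with a sum of normalized Gaussians.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(m, *params):
    """params is a flat sequence of (area, centre, width) triples."""
    y = np.zeros_like(m, dtype=float)
    for area, mu, sigma in zip(params[0::3], params[1::3], params[2::3]):
        y += area * np.exp(-0.5 * ((m - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return y

def fit_peak_group(mass_axis, counts, guesses):
    """guesses: initial (area, centre, width) values, one triple per peak."""
    popt, _ = curve_fit(gaussian_sum, mass_axis, counts, p0=guesses)
    return popt[0::3], popt     # fitted areas = ion yields, plus all parameters
```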