120 results for coded character set
Abstract:
This paper uses fuzzy-set ideal type analysis to assess the conformity of European leave regulations to four theoretical ideal-typical divisions of labour: male breadwinner, caregiver parity, universal breadwinner and universal caregiver. In contrast to the majority of previous studies, this analysis focuses on the extent to which leave regulations promote gender equality in the family and the transformation of traditional gender roles. The results demonstrate, first, that European countries cluster into five models that only partly coincide with countries’ geographical proximity. Second, none of the countries considered constitutes a universal caregiver model, while the male breadwinner ideal continues to provide the normative reference point for parental leave regulations in a large number of European states. Finally, we witness a growing emphasis at the national and EU levels on the universal breadwinner ideal, which leaves gender inequality in unpaid work unproblematized.
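For intuition, fuzzy-set ideal type analysis scores each country's membership in each ideal type by combining calibrated fuzzy memberships of the underlying policy dimensions, typically with the fuzzy AND (minimum) and fuzzy negation (1 − x). A minimal sketch, assuming two hypothetical calibrated dimensions that are not taken from the paper:

```python
# Minimal sketch of fuzzy-set ideal type analysis.
# The two policy dimensions and their calibration are hypothetical.

def neg(x):
    """Fuzzy negation: membership in the complement set."""
    return 1.0 - x

def ideal_type_membership(generous_leave, fathers_quota):
    """Membership scores in two of the four ideal types, combining the
    calibrated dimension scores with the fuzzy AND (minimum)."""
    return {
        # Generous leave without incentives for fathers to provide care.
        "male_breadwinner": min(generous_leave, neg(fathers_quota)),
        # Generous leave combined with strong father involvement.
        "universal_caregiver": min(generous_leave, fathers_quota),
    }

# A country with generous leave (0.8) but a weak fathers' quota (0.2):
print(ideal_type_membership(0.8, 0.2))
# {'male_breadwinner': 0.8, 'universal_caregiver': 0.2}
```

A case is typically assigned to the ideal type in which its membership exceeds 0.5, which is how countries cluster into models such as those reported above.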
After the Male Breadwinner Model? Childcare Services and the Division of Labor in European Countries
Abstract:
Fundamental reforms in childcare services appear to have eroded traditional support for the male breadwinner model across European states. There has been a strong debate about the direction of these changes and the ways in which childcare services can alter the division of labor and promote gender equality. This paper addresses these issues by using fuzzy-set ideal-type analysis to assess the conformity of childcare service provisions in European economies to Fraser’s four ideal-typical models: male breadwinner, caregiver parity, universal breadwinner, and universal caregiver. We find that traditional gender roles remain resilient in the majority of European countries, while different variants of the universal breadwinner model shape different forms of childcare policies. The more egalitarian universal caregiver model retains its utopian character.
Abstract:
Summary: Social work is a discipline that focuses on the person-in-the-environment. However, the social domains of influence have traditionally received more attention from the profession than the impact of the natural world on human well-being. With the development of ecological theories and growing threats to the environment, this gap has been addressed, and the notion of eco-social work is now attracting more interest. This article builds on this corpus of work by exploring, and augmenting, the thinking of the philosopher David Abram and his phenomenological investigation of perception, meaning, embodiment, language and Indigenous experience. The implications for eco-social work are then addressed.
Findings: The development of Abram’s philosophical thesis is charted by reviewing his presentation of the ideas of the European phenomenologists Edmund Husserl and Maurice Merleau-Ponty. It is argued that Abram uses phenomenology to explore the character of perception and the sensual foundations of language which, in Indigenous cultures, are connected with the natural world. A gap in Abram’s thinking is then revealed, showing the need to set human perception and language within an understanding of power. Overall, this re-worked thesis is underpinned by a meta-narrative in which ecology engages with philosophy, psychology and Indigenous experience.
Applications: By grounding such ideas in Slavoj Žižek’s construct of the sensuous event, three applications within social work are evinced, namely: (i) reflecting on the sensuous event in social work education; (ii) rekindling the sensuous event with Indigenous Peoples; and (iii) instigating the sensuous event with non-Indigenous populations.
Abstract:
We examine the representation of judgements of stochastic independence in probabilistic logics. We focus on a relational logic where (i) judgements of stochastic independence are encoded by directed acyclic graphs, and (ii) probabilistic assessments are flexible in the sense that they are not required to specify a single probability measure. We discuss issues of knowledge representation and inference that arise from our particular combination of graphs, stochastic independence, logical formulas and probabilistic assessments.
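As a toy illustration of point (i), consider the three-node graph B ← A → C: the Markov condition implies that B and C are stochastically independent given A, and this follows mechanically from the factorization the graph induces. A minimal numerical check, with hypothetical probability values not taken from the paper:

```python
import itertools

# Toy DAG: B <- A -> C, which encodes the judgement B ⊥ C | A.
# The factors P(A), P(B|A), P(C|A) below are hypothetical.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # [a][b]
p_c_given_a = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}  # [a][c]

def joint(a, b, c):
    """Joint probability factorized according to the DAG."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_a[a][c]

# Check B ⊥ C | A: P(b, c | a) = P(b | a) * P(c | a) for all assignments.
for a, b, c in itertools.product([0, 1], repeat=3):
    p_bc_given_a = joint(a, b, c) / p_a[a]
    assert abs(p_bc_given_a - p_b_given_a[a][b] * p_c_given_a[a][c]) < 1e-12
print("Independence judgement holds under the factorization.")
```

In the paper's setting the assessments need not pin down a single probability measure, so such a judgement must hold for every measure consistent with the constraints; the sketch above fixes one measure only for concreteness.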
Abstract:
The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error-correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). Exploiting diverse target architectures typically requires developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, to introduce FPGAs as a potential platform for efficiently executing simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium length (e.g., 8,000 bits) to long length (e.g., 64,800 bits) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
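The computational core of such a simulation is the iterative message-passing decoder itself. As a point of reference, here is a minimal sketch of the min-sum check-node update (a generic textbook formulation, not the paper's OpenCL kernel):

```python
import numpy as np

def check_node_update(llrs):
    """Min-sum check-node update: each outgoing message carries the sign
    product and the minimum magnitude of all *other* incoming LLRs."""
    llrs = np.asarray(llrs, dtype=float)
    signs = np.sign(llrs)                # assumes nonzero LLRs for brevity
    sign_prod = np.prod(signs)
    mags = np.abs(llrs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]  # two smallest magnitudes
    # The edge holding the global minimum receives the second minimum.
    out_mags = np.where(np.arange(len(llrs)) == order[0], min2, min1)
    # sign_prod * signs[i] equals the product of the signs excluding edge i.
    return sign_prod * signs * out_mags

# Messages arriving at a degree-4 check node:
print(check_node_update([0.9, -1.4, 0.3, 2.0]))
# [-0.3  0.3 -0.9 -0.3]
```

Each Monte Carlo run repeats this update (plus the variable-node update) over thousands of nodes and tens of iterations per codeword, which is why the kernel's throughput on CPUs, GPUs, and FPGAs dominates overall simulation time.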
Abstract:
Background: The COMET (Core Outcome Measures in Effectiveness Trials) Initiative is developing a publicly accessible online resource to collate the knowledge base for core outcome set (COS) development and the applied work from different health conditions. Ensuring that the database is as comprehensive as possible and keeping it up to date are key to its value for users. This requires the development and application of an optimal, multi-faceted search strategy to identify relevant material. This paper describes the challenges of designing and implementing such a search, outlining the development of the search strategy for studies of COS development and, in turn, the process for establishing a database of COS.
Methods: We investigated the performance characteristics of this strategy, including sensitivity, precision and number needed to read. We compared the contribution of each database towards identifying included studies in order to determine the best combination of methods for retrieving all included studies.
Results: Recall of the search strategies ranged from 4% to 87%, and precision from 0.77% to 1.13%. MEDLINE performed best in terms of recall, retrieving 216 (87%) of the 250 included records, followed by Scopus (44%). The Cochrane Methodology Register found just 4% of the included records. MEDLINE was also the database with the highest precision. The number needed to read varied between 89 (MEDLINE) and 130 (Scopus).
Conclusions: We found that two databases and hand searching were required to locate all of the studies in this review. MEDLINE alone retrieved 87% of the included studies, although 97% of the included studies were indexed in MEDLINE. The Cochrane Methodology Register did not contribute any records that were not found in the other databases and will not be included in our future searches to identify studies developing COS. Scopus had the lowest precision (0.77%) and the highest number needed to read (130). In future COMET searches for COS, a balance needs to be struck between the work involved in screening large numbers of records, the frequency of the searching and the likelihood that eligible studies will be identified by means other than the database searches.
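The reported metrics follow from their standard definitions, and since the number needed to read (NNR) is the reciprocal of precision, the figures can be cross-checked against one another. A short sketch using only the numbers quoted above:

```python
# Cross-checking the reported search metrics from their definitions.
relevant_retrieved = 216   # included records retrieved by MEDLINE
total_included = 250       # all records included in the review

recall = relevant_retrieved / total_included
print(f"MEDLINE recall: {recall:.1%}")   # 86.4% (reported as 87%)

# NNR = records screened per relevant record found = 1 / precision.
for database, nnr in {"MEDLINE": 89, "Scopus": 130}.items():
    print(f"{database}: NNR {nnr} -> precision {1 / nnr:.2%}")
# MEDLINE: NNR 89 -> precision 1.12% (reported as 1.13%)
# Scopus: NNR 130 -> precision 0.77%
```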
Abstract:
Background
Among clinical trials of interventions that aim to modify the time spent on mechanical ventilation by critically ill patients, there is considerable inconsistency in the outcomes chosen and in how they are measured. The Core Outcomes in Ventilation Trials (COVenT) study aims to develop a set of core outcomes for use in future ventilation trials in mechanically ventilated adults and children.
Methods/design
We will use a mixed methods approach that incorporates a randomised trial nested within a Delphi study and a consensus meeting. Additionally, we will conduct an observational cohort study to evaluate uptake of the core outcome set in published studies at 5 and 10 years following core outcome set publication. The three-round online Delphi study will use a list of outcomes that have been reported previously in a review of ventilation trials. The Delphi panel will include a range of stakeholder groups including patient support groups. The panel will be randomised to one of three feedback methods to assess the impact of the feedback mechanism on subsequent ranking of outcomes. A final consensus meeting will be held with stakeholder representatives to review outcomes.
Discussion
The COVenT study aims to develop a core outcome set for ventilation trials in critical care, explore the best Delphi feedback mechanism for achieving consensus and determine if participation increases use of the core outcome set in the long term.
Abstract:
We have recorded a new corpus of emotionally coloured conversations. Users were recorded while holding conversations with an operator who adopts, in sequence, four roles designed to evoke emotional reactions. The operator and the user are seated in separate rooms; they see each other through teleprompter screens and hear each other through speakers. To allow high-quality recording, they are recorded by five high-resolution, high-framerate cameras and by four microphones. All sensor information is recorded synchronously, with an accuracy of 25 μs. In total, we have recorded 20 participants, for a total of 100 character-conversational and 50 non-conversational recordings of approximately 5 minutes each. All recorded conversations have been fully transcribed and annotated for five affective dimensions and partially annotated for 27 other dimensions. The corpus has been made available to the scientific community through a web-accessible database.
Abstract:
A new approach to determining the local boundary of the voltage stability region in a cut-set power space (CVSR) is presented. Power flow tracing is first used to determine the generator-load pair most sensitive to each branch in the interface. The generator-load pairs are then used to realize accurate small disturbances by controlling the branch power flow in the increasing and decreasing directions, yielding new equilibrium points around the initial equilibrium point. Continuation power flow is then run from these new points to obtain the corresponding critical points around the initial critical point on the CVSR boundary. A hyperplane crossing the initial critical point can then be calculated by solving a set of linear algebraic equations. Finally, the presented method is validated on several systems, including the New England 39-bus system, the IEEE 118-bus system, and the EPRI-1000 bus system. The results show that the method is computationally more efficient and has a smaller approximation error, providing a useful approach for online voltage stability monitoring and assessment of power systems. This work is supported by the National Natural Science Foundation of China (No. 50707019), the Special Fund of the National Basic Research Program of China (No. 2009CB219701), the Foundation for the Author of National Excellent Doctoral Dissertation of PR China (No. 200439), the Tianjin Municipal Science and Technology Development Program (No. 09JCZDJC25000), and the National Major Project of Scientific and Technical Supporting Programs of China During the 11th Five-Year Plan Period (No. 2006BAJ03A06).
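The hyperplane step amounts to solving a small linear system: given n critical points in an n-dimensional cut-set power space, the coefficients a of a hyperplane a·x = b through them follow from one linear equation per point. A generic sketch, with hypothetical values and the normalization b = 1 (which assumes the plane does not pass through the origin):

```python
import numpy as np

# Three critical points on the CVSR boundary in a 3-branch cut-set
# power space (hypothetical values, in MW).
critical_points = np.array([
    [120.0, 80.0, 60.0],
    [118.0, 85.0, 58.0],
    [121.0, 79.0, 63.0],
])

# Hyperplane a . x = 1 through the points: solve the linear system X a = 1.
a = np.linalg.solve(critical_points, np.ones(3))

# The local CVSR boundary is approximated by a . x = 1, so 1 - a . x0
# serves as a rough stability-margin indicator for an operating point x0.
x0 = np.array([100.0, 70.0, 50.0])
print("margin indicator:", 1.0 - a @ x0)
```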
Abstract:
Cells experience damage from exogenous and endogenous sources that endanger genome stability. Several cellular pathways have evolved to detect DNA damage and mediate its repair. Although many proteins have been implicated in these processes, only recent studies have revealed how they operate in the context of higher-order chromatin structure. Here, we identify the nuclear oncogene SET (I2PP2A) as a modulator of the DNA damage response (DDR) and repair in chromatin surrounding double-strand breaks (DSBs). We demonstrate that depletion of SET increases DDR and survival in the presence of radiomimetic drugs, while overexpression of SET impairs DDR and homologous recombination (HR)-mediated DNA repair. SET interacts with the Kruppel-associated box (KRAB)-associated co-repressor KAP1, and its overexpression results in the sustained retention of KAP1 and heterochromatin protein 1 (HP1) on chromatin. Our results are consistent with a model in which SET-mediated chromatin compaction triggers an inhibition of DNA end resection and HR.
Abstract:
Purpose: Amorphous drug-polymer solid dispersions have been found to yield improved drug dissolution rates compared with their crystalline counterparts. However, a drug in the amorphous form possesses a higher Gibbs free energy than its associated crystalline state and can recrystallize. Drug-polymer phase diagrams constructed through the application of Flory-Huggins (F-H) theory contain a wealth of information regarding the thermodynamic and kinetic stability of an amorphous drug-polymer system. This study aimed to evaluate the effects of various experimental conditions on the detection of solubility and miscibility in drug-polymer binary systems. Methods: Felodipine (FD)-Polyvinylpyrrolidone (PVP) K15 (PVPK15) and FD-Polyvinylpyrrolidone/vinyl acetate (PVP/VA64) were the systems selected for this research. Physical mixtures with different drug loadings were prepared and ball milled. These samples were then analysed using Differential Scanning Calorimetry (DSC), and the melting point (Tend) and glass transition temperature (Tg) were measured at heating rates of 0.5, 1.0 and 5.0°C/min. Results: The melting point depression data were then used to calculate the F-H interaction parameter (χ) and extrapolated to lower temperatures to complete the liquid-solid transition curves. The theoretical binodal and spinodal curves were also constructed and used to identify regions within the phase diagram. The effects of polymer selection, DSC heating rate, time above the parent polymer Tg and polymer molecular weight were investigated by identifying amorphous drug miscibility limits at pharmaceutically relevant temperatures. Conclusion: The potential implications of these findings when applied to a non-ambient processing method such as Hot Melt Extrusion (HME) are also discussed.
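For reference, the melting point depression relation commonly used in the literature to extract the F-H interaction parameter from DSC data takes the following form (a standard expression, quoted here as background rather than from this abstract):

```latex
\frac{1}{T_m^{\mathrm{mix}}} - \frac{1}{T_m^{\mathrm{pure}}} =
  -\frac{R}{\Delta H_{\mathrm{fus}}}
  \left[ \ln \Phi_{\mathrm{drug}}
       + \left(1 - \frac{1}{m}\right) \Phi_{\mathrm{polymer}}
       + \chi \, \Phi_{\mathrm{polymer}}^{2} \right]
```

where $T_m^{\mathrm{mix}}$ and $T_m^{\mathrm{pure}}$ are the melting temperatures of the drug in the mixture and in pure form, $\Delta H_{\mathrm{fus}}$ is the heat of fusion of the pure drug, $\Phi$ denotes volume fractions, and $m$ is the ratio of polymer to drug molar volumes. Fitting the depression data at each drug loading yields $\chi$, which can then be extrapolated to lower temperatures as described above.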
Abstract:
BACKGROUND: A core outcome set (COS) can address problems of outcome heterogeneity and outcome reporting bias in trials and systematic reviews, including Cochrane reviews, helping to reduce waste. One of the aims of the international Core Outcome Measures in Effectiveness Trials (COMET) Initiative is to link the development and use of COS with the outcomes specified and reported in Cochrane reviews, including the outcomes listed in the summary of findings (SoF) tables. As part of this work, an earlier exploratory survey of the outcomes of newly published 2007 and 2011 Cochrane reviews was performed. This survey examined the use of COS, the variety of specified outcomes, and outcome reporting in Cochrane reviews by Cochrane Review Group (CRG). To examine changes over time and to explore outcomes that were repeatedly specified over time in Cochrane reviews by CRG, we conducted a follow-up survey of outcomes in 2013 Cochrane reviews.
METHODS: A descriptive survey of outcomes in Cochrane reviews that were first published in 2013. Outcomes specified in the methods sections and reported in the results section of the Cochrane reviews were examined by CRG. We also explored the uptake of SoF tables, the number of outcomes included in these, and the quality of the evidence for the outcomes.
RESULTS: Across the 50 CRGs, 375 Cochrane reviews that included at least one study specified a total of 3142 outcomes. Of these outcomes, 32% (1008) were not reported in the results section of these reviews. For 23% (233) of these non-reported outcomes, we did not find any reason in the text of the review for this non-report. Fifty-seven percent (216/375) of reviews included a SoF table.
CONCLUSIONS: The proportion of specified outcomes that were reported in Cochrane reviews had increased in 2013 (68%) compared to 2007 (61%) and 2011 (65%). Importantly, 2013 Cochrane reviews that did not report specified outcomes were twice as likely to provide an explanation for why the outcome was not reported. There has been an increased uptake of SoF tables in Cochrane reviews. Outcomes that were repeatedly specified in Cochrane reviews by CRG in 2007, 2011, and 2013 may assist COS development.
Abstract:
This paper considers the problem of identifying a high-dimensional nonlinear non-parametric system when only a limited data set is available. Algorithms are proposed that exploit the relationship between the input variables and the output, and further the inter-dependence among the input variables, so that the importance of each input variable can be established. A key component of these algorithms is a non-parametric two-stage input selection algorithm.
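As a rough illustration of the general idea (a generic relevance-then-redundancy scheme, not the paper's specific two-stage algorithm), candidate inputs can first be ranked by their dependence on the output and then pruned for mutual redundancy:

```python
import numpy as np

def select_inputs(X, y, n_select, redundancy_threshold=0.9):
    """Generic two-stage input selection sketch.
    Stage 1: rank candidate inputs by |correlation| with the output.
    Stage 2: skip candidates highly correlated with already-selected inputs.
    (Correlation stands in for a dependence measure; the paper's
    non-parametric method is not reproduced here.)"""
    relevance = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    selected = []
    for j in np.argsort(relevance)[::-1]:          # most relevant first
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_threshold
            for k in selected
        )
        if not redundant:
            selected.append(int(j))
        if len(selected) == n_select:
            break
    return selected

# Example: x0 drives the output, x1 nearly duplicates x0, x2 is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=200)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(select_inputs(X, y, n_select=2))  # [0, 2] or [1, 2]: duplicate pruned
```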