947 results for Composition of Human Capital
Abstract:
We examined the effects of cofactors and DNA on the stability, oligomeric state and conformation of the human mitochondrial DNA helicase. We demonstrate that low salt conditions result in protein aggregation that may cause dissociation of oligomeric structure. The low salt sensitivity of the mitochondrial DNA helicase is mitigated by the presence of magnesium, nucleotide, and increased temperature. Electron microscopic and glutaraldehyde cross-linking analyses provide the first evidence of a heptameric oligomer and its interconversion from a hexameric form. Limited proteolysis by trypsin shows that binding of nucleoside triphosphate produces a conformational change that is distinct from the conformation observed in the presence of nucleoside diphosphate. We find that single-stranded DNA binding occurs in the absence of cofactors and renders the mitochondrial DNA helicase more susceptible to proteolytic digestion. Our studies indicate that the human mitochondrial DNA helicase shares basic properties with the SF4 replicative helicases, but also identify common features with helicases outside the superfamily, including dynamic conformations similar to other AAA+ ATPases.
Abstract:
Korosteleva-Polglase, Elena, 'Can theories of social capital explain dissenting patterns of engagement in the new Europe?', Contemporary Politics (2006) 12(2), pp. 175-191. RAE2008
Abstract:
Background: Infection with multiple types of human papillomavirus (HPV) is one of the main risk factors associated with the development of cervical lesions. In this study, cervical samples collected from 1,810 women with diverse sociocultural backgrounds, who attended their cervical screening program in different geographical regions of Colombia, were examined for the presence of cervical lesions and HPV by Papanicolaou testing and DNA PCR detection, respectively. Principal Findings: The negative binomial distribution model used in this study showed differences between the observed and expected values within some risk factor categories analyzed. Particularly in the case of single infection and coinfection with more than 4 HPV types, observed frequencies were smaller than expected, while the number of women infected with 2 to 4 viral types was higher than expected. Data analysis according to a negative binomial regression showed an increase in the risk of acquiring more HPV types in women who were of indigenous ethnicity (+37.8%), while this risk decreased in women who had given birth more than 4 times (-31.1%), or were of mestizo (-24.6%) or black (-40.9%) ethnicity. Conclusions: According to a theoretical probability distribution, the observed number of women having either a single infection or more than 4 viral types was smaller than expected, while for those infected with 2-4 HPV types it was larger than expected. Taking into account that this study showed a higher HPV coinfection rate in the indigenous ethnicity, the role of underlying factors should be assessed in detail in future studies.
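A minimal sketch of the kind of count-model analysis described above is given below, assuming a hypothetical per-woman table of the number of HPV types detected plus categorical covariates; the data frame, column names, and coding are illustrative and are not taken from the study.

```python
# Hedged sketch: negative binomial regression of the number of HPV types
# detected per woman on ethnicity and parity. The data, column names, and
# coding are hypothetical, not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "n_hpv_types": rng.poisson(1.2, size=500),                       # count outcome
    "ethnicity": rng.choice(["mestizo", "indigenous", "black"], 500),
    "births_gt4": rng.integers(0, 2, size=500),                      # >4 births indicator
})

# Negative binomial regression with a log link; exponentiated coefficients
# are rate ratios, e.g. exp(beta) = 1.378 corresponds to a +37.8% change.
model = smf.negativebinomial("n_hpv_types ~ C(ethnicity) + births_gt4", data=df).fit(disp=0)
print(np.exp(model.params.drop("alpha", errors="ignore")))  # rate ratios for the covariates
```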
Abstract:
The problem of the acquisition of first-language phonology is addressed within a general information-processing perspective. In this sense, language acquisition is viewed as a process of biologically founded pattern formation due to information exchanges between an adult and a child. Moreover, the process is cognitive in that the child, as a goal-seeking and error-correcting individual, undertakes an intricate task of compressing a huge variety of linguistic stimuli in order to build an effective information code. It is further assumed that the basic mechanism which leads to the establishment of fully articulate linguistic ability is that of simulation. The mechanism works through a compression of a set of initial variables (i.e. initial conditions) into a minimum-length algorithm and a subsequent construction of an integrated system of language-specific attractors. It is only then that the language user is capable of participating in an information transaction in a fully developed manner.
Abstract:
This technical report presents a combined solution for two problems: first, tracking objects in 3D space and estimating their trajectories; and second, computing the similarity between previously estimated trajectories and clustering them using these similarities. For the first part, trajectories are estimated using an EKF formulation that provides the 3D trajectory up to a constant. To improve accuracy, multiple hypotheses are followed when occlusions appear. For the second problem, we compute the distances between trajectories using a similarity measure based on the LCSS (longest common subsequence) formulation. Similarities are computed between projections of the trajectories on the coordinate axes. Finally, we group trajectories together based on the previously computed distances, using a clustering algorithm. To check the validity of our approach, several experiments using real data were performed.
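As a rough illustration of the similarity step described above, the sketch below computes an LCSS-based distance between two one-dimensional trajectory projections; the matching threshold eps and temporal window delta are illustrative parameters rather than values taken from the report.

```python
# Hedged sketch: LCSS similarity between two 1-D trajectory projections.
# eps (spatial matching threshold) and delta (temporal window) are
# illustrative parameters; the report's actual settings are not given here.
import numpy as np

def lcss_length(a, b, eps=0.5, delta=5):
    """Length of the longest common subsequence of samples that match
    within eps in value and delta in time index (standard LCSS recursion)."""
    n, m = len(a), len(b)
    L = np.zeros((n + 1, m + 1), dtype=int)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if abs(a[i - 1] - b[j - 1]) <= eps and abs(i - j) <= delta:
                L[i, j] = L[i - 1, j - 1] + 1
            else:
                L[i, j] = max(L[i - 1, j], L[i, j - 1])
    return L[n, m]

def lcss_distance(a, b, **kw):
    """Distance in [0, 1]: 1 minus the normalized LCSS similarity."""
    return 1.0 - lcss_length(a, b, **kw) / min(len(a), len(b))

# Example: two noisy projections of similar trajectories on one axis.
t = np.linspace(0, 1, 50)
print(lcss_distance(np.sin(2 * np.pi * t), np.sin(2 * np.pi * t) + 0.1))
```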
Abstract:
As new multi-party edge services are deployed on the Internet, application-layer protocols with complex communication models and event dependencies are increasingly being specified and adopted. To ensure that such protocols (and compositions thereof with existing protocols) do not result in undesirable behaviors (e.g., livelocks) there needs to be a methodology for the automated checking of the "safety" of these protocols. In this paper, we present ingredients of such a methodology. Specifically, we show how SPIN, a tool from the formal systems verification community, can be used to quickly identify problematic behaviors of application-layer protocols with non-trivial communication models—such as HTTP with the addition of the "100 Continue" mechanism. As a case study, we examine several versions of the specification for the Continue mechanism; our experiments mechanically uncovered multi-version interoperability problems, including some which motivated revisions of HTTP/1.1 and some which persist even with the current version of the protocol. One such problem resembles a classic degradation-of-service attack, but can arise between well-meaning peers. We also discuss how the methods we employ can be used to make explicit the requirements for hardening a protocol's implementation against potentially malicious peers, and for verifying an implementation's interoperability with the full range of allowable peer behaviors.
Abstract:
The performance of different classification approaches is evaluated using a view-based approach for motion representation. The view-based approach uses computer vision and image processing techniques to register and process the video sequence. Two motion representations, called Motion Energy Images and Motion History Images, are then constructed. These representations collapse the temporal component in such a way that no explicit temporal analysis or sequence matching is needed. Statistical descriptions are then computed using moment-based features and dimensionality reduction techniques. For these tests, we used 7 Hu moments, which are invariant to scale and translation. Principal Components Analysis is used to reduce the dimensionality of this representation. The system is trained using different subjects performing a set of examples of every action to be recognized. Given these samples, K-nearest neighbor, Gaussian, and Gaussian mixture classifiers are used to recognize new actions. Experiments are conducted using instances of eight human actions (i.e., eight classes) performed by seven different subjects. Comparisons of the performance of these classifiers under different conditions are analyzed and reported. Our main goals are to test this dimensionality-reduced representation of actions and, more importantly, to use this representation to compare the advantages of different classification approaches in this recognition task.
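The pipeline sketched below illustrates the feature-extraction and classification chain described above (Hu moments of motion templates, PCA, then a K-nearest-neighbor classifier), assuming precomputed template images; the data and parameter values are placeholders, not the paper's settings.

```python
# Hedged sketch: Hu moments of motion templates -> PCA -> k-NN classifier.
# The images, labels, and parameter values are illustrative placeholders.
import cv2
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def hu_features(template):
    """7 Hu moments of a single motion template (e.g. an MEI or MHI), log-scaled."""
    hu = cv2.HuMoments(cv2.moments(template.astype(np.float32))).ravel()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# templates: list of 2-D arrays (one per action example); labels: action class ids.
templates = [np.random.rand(64, 64) for _ in range(80)]
labels = np.random.randint(0, 8, size=80)  # eight action classes

X = np.array([hu_features(t) for t in templates])
clf = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=3))
clf.fit(X, labels)
print(clf.predict(X[:3]))
```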
Abstract:
Previous studies have reported considerable intersubject variability in the three-dimensional geometry of the human primary visual cortex (V1). Here we demonstrate that much of this variability is due to extrinsic geometric features of the cortical folds, and that the intrinsic shape of V1 is similar across individuals. V1 was imaged in ten ex vivo human hemispheres using high-resolution (200 μm) structural magnetic resonance imaging at high field strength (7 T). Manual tracings of the stria of Gennari were used to construct a surface representation, which was computationally flattened into the plane with minimal metric distortion. The intrinsic shape of V1 was determined from the boundary of the planar representation of the stria. An ellipse provided a simple parametric shape model that was a good approximation to the boundary of flattened V1. The aspect ratio of the best-fitting ellipse was found to be consistent across subjects, with a mean of 1.85 and standard deviation of 0.12. Optimal rigid alignment of size-normalized V1 produced greater overlap than that achieved by previous studies using different registration methods. A shape analysis of published macaque data indicated that the intrinsic shape of macaque V1 is also stereotyped, and similar to the human V1 shape. Previous measurements of the functional boundary of V1 in human and macaque are in close agreement with these results.
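One simple way to obtain such an aspect ratio is sketched below, using the principal axes of the boundary point cloud in the flattened plane; this is an illustrative stand-in rather than the specific ellipse-fitting procedure used in the study.

```python
# Hedged sketch: aspect ratio of a planar boundary from the principal axes of
# its point cloud. A generic illustration, not the study's fitting procedure.
import numpy as np

def aspect_ratio(points):
    """points: (N, 2) array of boundary coordinates in the flattened plane.
    Returns the ratio of the major to the minor principal axis length."""
    centered = points - points.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered.T))  # ascending order
    return float(np.sqrt(eigvals[-1] / eigvals[0]))

# Example: points sampled on an ellipse with semi-axes 1.85 and 1.0.
theta = np.linspace(0, 2 * np.pi, 400)
pts = np.column_stack([1.85 * np.cos(theta), np.sin(theta)])
print(aspect_ratio(pts))  # close to 1.85
```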
Abstract:
The 2-channel Ellias-Grossberg neural pattern generator of Cohen, Grossberg, and Pribe [1] is shown to simulate data from human bimanual coordination tasks in which anti-phase oscillations at low frequencies spontaneously switch to in-phase oscillations at high frequencies, in-phase oscillations can be performed at both low and high frequencies, phase fluctuations occur at the anti-phase to in-phase transition, and a "seagull effect" of larger errors occurs at intermediate phases.
Abstract:
A biomechanical model of the human oculomotor plant kinematics in 3-D as a function of muscle length changes is presented. It can represent a range of alternative interpretations of the data as a function of one parameter. The model is free from such deficits as singularities and the nesting of axes found in alternative formulations such as the spherical wrist (Paul, 1981). The equations of motion are defined on a quaternion-based representation of eye rotations and are compact and computationally efficient.
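The sketch below shows the basic quaternion operations such a representation relies on (composition of rotations and rotation of a gaze vector); it is generic quaternion algebra under an illustrative axis convention, not the paper's oculomotor equations of motion.

```python
# Hedged sketch: unit-quaternion composition and vector rotation, the basic
# operations underlying a quaternion-based representation of eye rotations.
# Generic quaternion algebra; the axis convention is illustrative only.
import numpy as np

def qmul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def axis_angle(axis, angle):
    """Unit quaternion for a rotation of `angle` radians about `axis`."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q, i.e. q * (0, v) * conj(q)."""
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate([[0.0], v])), qc)[1:]

# Example (illustrative convention: z vertical, y straight ahead): a 20 degree
# horizontal rotation about z followed by a 10 degree vertical rotation about x,
# applied to a straight-ahead gaze vector.
q = qmul(axis_angle([1, 0, 0], np.radians(10)), axis_angle([0, 0, 1], np.radians(20)))
print(rotate(q, np.array([0.0, 1.0, 0.0])))
```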
Abstract:
A notable feature of the surveillance case law of the European Court of Human Rights has been the tendency of the Court to focus on the “in accordance with the law” aspect of the Article 8 ECHR inquiry. This focus has been the subject of some criticism, but the impact of this approach on the manner in which domestic surveillance legislation has been formulated in the Party States has received little scholarly attention. This thesis addresses that gap in the literature through its consideration of the Interception of Postal Packets and Telecommunications Messages (Regulation) Act, 1993 and the Criminal Justice (Surveillance) Act, 2009. While both Acts provide several of the safeguards endorsed by the European Court of Human Rights, this thesis finds that they suffer from a number of crucial weaknesses that undermine the protection of privacy. This thesis demonstrates how the focus of the European Court of Human Rights on the “in accordance with the law” test has resulted in some positive legislative change. Notwithstanding this fact, it is maintained that the legality approach has gained prominence at the expense of a full consideration of the “necessary in a democratic society” inquiry. This has resulted in superficial legislative responses at the domestic level, including from the Irish government. Notably, through the examination of a number of more recent cases, this project discerns a significant alteration in the interpretive approach adopted by the European Court of Human Rights regarding the application of the necessity test. The implications of this development are considered and the outlook for Irish surveillance legislation is assessed.
Abstract:
Numerous laboratory experiments have been performed in an attempt to mimic atmospheric secondary organic aerosol (SOA) formation. However, it is still unclear how closely the aerosol particles generated in laboratory experiments resemble atmospheric SOA with respect to their detailed chemical composition. In this study, we generated SOA in a simulation chamber from the ozonolysis of α-pinene and of a biogenic volatile organic compound (BVOC) mixture containing α- and β-pinene, Δ3-carene, and isoprene. The detailed molecular composition of laboratory-generated SOA was compared with that of background ambient aerosol collected at a boreal forest site (Hyytiälä, Finland) and an urban location (Cork, Ireland) using direct infusion nanoelectrospray ultrahigh resolution mass spectrometry. Kendrick Mass Defect and Van Krevelen approaches were used to identify and compare compound classes and distributions of the detected species. The laboratory-generated SOA contained a distinguishable group of dimers that was not observed in the ambient samples. The presence of dimers was found to be less pronounced in the SOA from the VOC mixtures than in the one-component precursor system. The elemental composition of the compounds identified in the monomeric region from the ozonolysis of both α-pinene and the VOC mixtures represented the ambient organic composition of particles collected at the boreal forest site reasonably well, with about 70% of molecular formulae in common. In contrast, large differences were found between the laboratory-generated BVOC samples and the ambient urban sample. To our knowledge, this is the first direct comparison of the molecular composition of laboratory-generated SOA from BVOC mixtures and ambient samples.
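The two screening metrics named above can be computed directly from assigned molecular formulae, as in the sketch below: a CH2-based Kendrick mass defect and Van Krevelen coordinates (O/C and H/C). The example compound and the sign convention for the mass defect are illustrative.

```python
# Hedged sketch: CH2-based Kendrick mass defect and Van Krevelen ratios for
# species with assigned formulae. Example mass and formula are illustrative.
CH2_EXACT = 14.01565  # exact mass of a CH2 unit

def kendrick_mass_defect(exact_mass):
    """Kendrick mass defect on the CH2 scale: nominal Kendrick mass minus
    Kendrick mass (one common sign convention)."""
    kendrick_mass = exact_mass * 14.0 / CH2_EXACT
    return round(kendrick_mass) - kendrick_mass

def van_krevelen(c, h, o):
    """(O/C, H/C) Van Krevelen coordinates for a CcHhOo formula."""
    return o / c, h / c

# Example: pinonic acid, C10H16O3 (monoisotopic mass ~184.1099 Da).
print(kendrick_mass_defect(184.1099))
print(van_krevelen(10, 16, 3))
```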
Abstract:
This thesis traces a genealogy of the discourse of mathematics education reform in Ireland at the beginning of the twenty-first century, at a time when the hegemonic political discourse is that of neoliberalism. It draws on the work of Michel Foucault to identify the network of power relations involved in the development of a single case of curriculum reform – in this case Project Maths. It identifies the construction of an apparatus within the fields of politics, economics and education, the elements of which include institutions like the OECD and the Government, the bureaucracy, expert groups and special interest groups, the media, the school, the State, state assessment and international assessment. Five major themes in educational reform emerge from the analysis: the arrival of neoliberal governance in Ireland; the triumph of human capital theory as the hegemonic educational philosophy here; the dominant role of OECD/PISA and its values in the mathematics education discourse in Ireland; the fetishisation of western scientific knowledge and knowledge as commodity; and the formation of a new kind of subjectivity, namely the subjectivity of the young person as a form of human-capital-to-be. In particular, it provides a critical analysis of the influence of OECD/PISA on the development of mathematics education policy here – especially on Project Maths curriculum, assessment and pedagogy. It unpacks the arguments in favour of curriculum change and lays bare their ideological foundations. This discourse contextualises educational change as occurring within a rapidly changing economic environment where the State's economic aspirations and developments in science, technology and communications are reshaping both the focus of business and the demands being put on education. Within this discourse, education is to be repurposed and its consequences measured against the paradigm of the Knowledge Economy – usually characterised as the inevitable or necessary future of a carefully defined present.
Abstract:
To evaluate the immunogenicity and safety of a 23-valent pneumococcal vaccine in human immunodeficiency virus (HIV)-seropositive patients, 80 men and 18 women received 1 dose of the vaccine (Pneumo 23; Pasteur Mérieux MSD, Brussels). The total IgG antibody response against all 23 Streptococcus pneumoniae capsular antigens was measured. Antibody levels were expressed in arbitrary units per microliter, referring to a standard curve. Geometric mean titers of the total IgG capsular antibodies on the day of vaccination and 30-45 days later were compared. The ratios of titers after and before vaccination in patients with > 500, 200-500, and < 200 CD4 lymphocytes/microL were 10, 10, and 12.6, respectively. Nonresponse (ratio < 4) occurred in 17% of patients and was unrelated to CD4 cell count. The vaccine was well tolerated; no serious side effects occurred. In 83% of the patients with HIV infection, the total antipneumococcal IgG level was higher after vaccination.
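The response metric used above reduces to a simple calculation, sketched below: geometric mean titers before and after vaccination and their ratio, with nonresponse defined as a fold-rise below 4. The titer values are illustrative.

```python
# Hedged sketch: geometric mean titers (GMT) and the post/pre vaccination
# ratio, with nonresponse defined as a fold-rise < 4. Titers are illustrative.
import numpy as np

def gmt(titers):
    """Geometric mean of antibody titers."""
    return float(np.exp(np.mean(np.log(titers))))

pre = np.array([12.0, 30.0, 8.0, 50.0])       # day-0 titers (arbitrary units/uL)
post = np.array([150.0, 240.0, 20.0, 800.0])  # day 30-45 titers

ratio = gmt(post) / gmt(pre)
print(f"GMT ratio: {ratio:.1f}, nonresponder: {ratio < 4}")
```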
Abstract:
Bacterial lipopolysaccharide (endotoxin) is a frequent contaminant of biological specimens and is also known to be a potent inducer of beta-chemokines and other soluble factors that inhibit HIV-1 infection in vitro. Though lipopolysaccharide (LPS) has been shown to stimulate the production of soluble HIV-1 inhibitors in cultures of monocyte-derived macrophages, the ability of LPS to induce similar inhibitors in other cell types is poorly characterized. Here we show that LPS exhibits potent anti-HIV activity in phytohemagglutinin-stimulated peripheral blood mononuclear cells (PBMCs) but has no detectable anti-HIV-1 activity in TZM-bl cells. The anti-HIV-1 activity of LPS in PBMCs was strongly associated with the production of beta-chemokines from CD14-positive monocytes. Culture supernatants from LPS-stimulated PBMCs exhibited potent anti-HIV-1 activity when added to TZM-bl cells but, in this case, the antiviral activity appeared to be related to IFN-gamma rather than to beta-chemokines. These observations indicate that LPS stimulates PBMCs to produce a complex array of soluble HIV-1 inhibitors, including beta-chemokines and IFN-gamma, that differentially inhibit HIV-1 depending on the target cell type. The results also highlight the need to use endotoxin-free specimens to avoid artifacts when assessing HIV-1-specific neutralizing antibodies in PBMC-based assays.