822 results for Lazzard, Gilbert: Actancy. Empirical approaches to language typology


Relevance:

100.00%

Publisher:

Abstract:

The research described in this PhD thesis focuses on proteomics approaches to studying the effect of oxidation on the modification status and protein-protein interactions of PTEN, a redox-sensitive phosphatase involved in a number of cellular processes, including metabolism, apoptosis, cell proliferation, and survival. While direct evidence of redox regulation of PTEN and its downstream signaling has been reported, the effect of cellular oxidative stress or direct PTEN oxidation on PTEN structure and interactome is still poorly defined. In a first study, GST-tagged PTEN was directly oxidized over a range of hypochlorous acid (HOCl) concentrations, assayed for phosphatase activity, and oxidative post-translational modifications (oxPTMs) were quantified using LC-MS/MS-based label-free methods. In a second study, GST-tagged PTEN was prepared in a reduced and a reversibly H2O2-oxidized form, immobilized on a resin support, and incubated with HCT116 cell lysate to capture PTEN-interacting proteins, which were analyzed by LC-MS/MS and comparatively quantified using label-free methods. In parallel experiments, HCT116 cells transfected with GFP-tagged PTEN were treated with H2O2, and PTEN-interacting proteins were immunoprecipitated using standard methods. Several high-abundance HOCl-induced oxPTMs were mapped, including those occurring at amino acids known to be important for PTEN phosphatase activity and protein-protein interactions, such as Met35, Tyr155, Tyr240, and Tyr315. A PTEN redox interactome was also characterized, identifying a number of PTEN-interacting proteins that vary with the reversible inactivation of PTEN caused by H2O2 oxidation. These included new PTEN interactors as well as the redox proteins peroxiredoxin-1 (Prdx1) and thioredoxin (Trx), which are known to be involved in the recycling of the PTEN active site following H2O2-induced reversible inactivation.
The results suggest that the oxidative modification of PTEN causes functional alterations in PTEN structure and interactome, with fundamental implications for the PTEN signaling role in many cellular processes, such as those involved in the pathophysiology of disease and ageing.

Relevance:

100.00%

Publisher:

Abstract:

This study examines one of the central questions of Public-Private Partnership (PPP) projects: how the public interest may be protected in them. It weighs market-based and non-market-based solutions, and also explains why the protection of the public interest is a special concern in PPP projects. Establishing the conditions of regulated competition is, on several accounts, of paramount importance for enabling value-creating PPP projects, although the existing solutions are not free of anomalies. To support the institutional workings of representative democracy, the literature recommends solutions based on public participation; this approach, too, can aid value-creating PPP projects in several forms, with a variety of aims, and likewise with its own challenges. Following an evaluative analysis of the options in principle, the study also weighs the realities of implementation.

Relevance:

100.00%

Publisher:

Abstract:

The author compares the Von Neumann model, developed for purely theoretical purposes, with the Leontief model, designed for practical applications. The similar mathematical structure of the Von Neumann model and of the closed, stationary variant of the Leontief model assuming an annual production period suggests that the latter can be interpreted as a special case of the former. Unfolding and comparing the economic content and assumptions of each model in detail, the author shows that this conclusion is misleading: the two models are fundamentally different, and neither can be derived from the other. The possibility of joint production and technological choice is an indispensable assumption of the Von Neumann model, while the assumption of an annual production period rules out taking service flows explicitly into account. All of these assumptions are alien to the Leontief model.
It is shown that the two models are in fact special cases of a more general stock-flow stationary model, reduced to forms containing only flow variables.

Relevance:

100.00%

Publisher:

Abstract:

Current views of the nature of knowledge and of learning suggest that instructional approaches in science education should pay closer attention to how students learn rather than to how they are taught. This study examined two approaches to teaching science based on contrasting perspectives on learning, social constructivist and traditional, and the effects they have on students' attitudes and achievement. Four categories of attitudes were measured using the Upper Secondary Attitude Questionnaire: attitude towards school, towards the importance of science, towards science as a career, and towards science as a subject in school. Achievement was measured by average class grades and by a researcher/teacher-constructed 30-item test comprising three sub-scales of items based on knowledge and on applications involving near-transfer and far-transfer of concepts. The sample consisted of 202 students in nine intact chemistry classrooms at a large high school in Miami, Florida, and involved two teachers. Results were analyzed using a two-way analysis of covariance (ANCOVA), with a pretest in attitude as the covariate for attitudes and prior achievement as the covariate for achievement. Adjusted mean scores were compared between the two groups and between females and males. With constructivist-based teaching, students showed a more favorable attitude towards science as a subject and obtained significantly higher scores in class achievement, total achievement, and achievement on the knowledge sub-scale of the knowledge and application test. Students in the traditional group showed a more favorable attitude towards school. Females showed a significantly more positive attitude towards the importance of science and obtained significantly higher scores in class achievement. No significant interaction effects were obtained for method of instruction by gender.
This study lends some support to the view that constructivist-based approaches to teaching science are a viable alternative to traditional modes of teaching. It is suggested that in science education, more consideration be given to those aspects of classroom teaching that foster closer coordination between social influences and individual learning.

Relevance:

100.00%

Publisher:

Abstract:

The accurate and reliable estimation of travel time based on point detector data is needed to support Intelligent Transportation System (ITS) applications. It has been found that the quality of travel time estimation is a function of the method used in the estimation and varies for different traffic conditions. In this study, two hybrid on-line travel time estimation models, and their corresponding off-line methods, were developed to achieve better estimation performance under various traffic conditions, including recurrent congestion and incidents. The first model combines the Mid-Point method, which is a speed-based method, with a traffic flow-based method. The second model integrates two speed-based methods: the Mid-Point method and the Minimum Speed method. In both models, the switch between travel time estimation methods is based on the congestion level and queue status automatically identified by clustering analysis. During incident conditions with rapidly changing queue lengths, shock wave analysis-based refinements are applied for on-line estimation to capture the fast queue propagation and recovery. Travel time estimates obtained from existing speed-based methods, traffic flow-based methods, and the models developed were tested using both simulation and real-world data. The results indicate that all tested methods performed at an acceptable level during periods of low congestion. However, their performances vary with an increase in congestion. Comparisons with other estimation methods also show that the developed hybrid models perform well in all cases. Further comparisons between the on-line and off-line travel time estimation methods reveal that off-line methods perform significantly better only during fast-changing congested conditions, such as during incidents. 
The impacts of major influential factors on the performance of travel time estimation, including data preprocessing procedures, detector errors, detector spacing, frequency of travel time updates to traveler information devices, travel time link length, and posted travel time range, were investigated in this study. The results show that these factors have more significant impacts on the estimation accuracy and reliability under congested conditions than during uncongested conditions. For the incident conditions, the estimation quality improves with the use of a short rolling period for data smoothing, more accurate detector data, and frequent travel time updates.
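As a rough illustration of the speed-based component referred to above, the Mid-Point idea can be sketched as follows. This is a minimal sketch under stated assumptions: the function name, detector layout, and units are illustrative, not the thesis's actual implementation.

```python
def midpoint_travel_time(positions, speeds):
    """Speed-based Mid-Point estimate of link travel time.

    Each detector's spot speed is assumed to hold from the midpoint
    with its upstream neighbor to the midpoint with its downstream
    neighbor; summing segment length over speed gives the link time.
    positions: detector locations along the link (miles), sorted.
    speeds: spot speeds at those detectors (mph).
    Returns travel time in hours.
    """
    n = len(speeds)
    total = 0.0
    for i, v in enumerate(speeds):
        left = positions[0] if i == 0 else (positions[i - 1] + positions[i]) / 2
        right = positions[-1] if i == n - 1 else (positions[i] + positions[i + 1]) / 2
        total += (right - left) / v
    return total

# Three detectors over a 2-mile link; the middle detector sees congestion.
t = midpoint_travel_time([0.0, 1.0, 2.0], [60.0, 30.0, 60.0])  # 0.05 h (3 min)
```

In a hybrid model of the kind described, an estimator like this would be switched out for a flow-based or minimum-speed estimator when the clustering analysis flags congested or queued conditions.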

Relevance:

100.00%

Publisher:

Abstract:

Now that baby boomers are older and pursuing more career-oriented jobs, managers in the hospitality industry are experiencing the effects of the present labor crisis; they now know that those vacant hourly jobs are going to be tough to fill with quality personnel. The companies able to attract quality personnel by offering employees what they need and want will be the successful ones in the next decade. The authors explain how the labor crisis is currently affecting the hospitality industry and suggest how firms may survive the "labor crash" of the 1990s by applying marketing technology to human resource management.

Relevance:

100.00%

Publisher:

Abstract:

Physical therapy students must apply the relevant information learned in their academic and clinical experience to problem solve in treating patients. I compared the clinical cognitive competence in patient care of second-year master's students enrolled in two different curricular programs: modified problem-based (M P-B; n = 27) and subject-centered (S-C; n = 41). The main features of S-C learning are lecture and demonstration as the major teaching strategies and no exposure to patients or problem-solving learning until the sciences (knowledge) have been taught. By comparison, the main features of M P-B learning are case study in small student groups as the main teaching strategy, early and frequent exposure to patients, and knowledge and problem-solving skills learned together for each specific case. Basic and clinical orthopedic knowledge was measured with a written test with open-ended items. Problem-solving skills were measured with a written case-study patient problem test yielding three subscores: assessment, problem identification, and treatment planning. Results indicated that, among the demographic and educational characteristics analyzed, there was a significant difference between groups on ethnicity, bachelor's degree type, admission GPA, and current GPA, but no significant difference on gender, age, possession of a physical therapy assistant license, or GRE score. In addition, the M P-B group achieved a significantly higher adjusted mean score on the orthopedic knowledge test after controlling for GRE scores. The S-C group achieved a significantly higher adjusted mean total score and treatment planning subscore on the case-study test after controlling for orthopedic knowledge test scores. These findings did not support their respective research hypotheses. There was no significant difference between groups on the assessment and problem identification subscores of the case-study test.
The integrated M P-B approach promoted superior retention of basic and clinical science knowledge. The results on problem-solving skills were mixed. The S-C approach facilitated superior treatment planning skills but equivalent patient assessment and problem identification skills, by emphasizing all three equally and exposing the students to more patients, with a wider variety of orthopedic physical therapy needs, than the M P-B approach.

Relevance:

100.00%

Publisher:

Abstract:

Low-rise buildings are often subjected to high wind loads during hurricanes that lead to severe damage and cause water intrusion. It is therefore important to estimate accurate wind pressures for design purposes to reduce losses. Wind loads on low-rise buildings can differ significantly depending upon the laboratory in which they were measured. The differences are due in large part to inadequate simulations of the low-frequency content of atmospheric velocity fluctuations in the laboratory and to the small scale of the models used for the measurements. A new partial turbulence simulation methodology was developed for simulating the effect of low-frequency flow fluctuations on low-rise buildings more effectively, from the point of view of testing accuracy and repeatability, than is currently possible. The methodology was validated by comparing aerodynamic pressure data for building models obtained in the open-jet 12-Fan Wall of Wind (WOW) facility against their counterparts in a boundary-layer wind tunnel. Field measurements of pressures on the Texas Tech University building and the Silsoe building were also used for validation purposes. Tests in partial simulation are freed of integral length scale constraints, meaning that model length scales in such testing are limited only by blockage considerations. The partial simulation methodology can thus be used to produce aerodynamic data for low-rise buildings by using large-scale models in wind tunnels and WOW-like facilities. This is a major advantage, because large-scale models allow for accurate modeling of architectural details, testing at higher Reynolds number, greater spatial resolution of the pressure taps in high-pressure zones, and assessment of the performance of aerodynamic devices intended to reduce wind effects.
The technique eliminates a major cause of discrepancies among measurements conducted in different laboratories and can help to standardize flow simulations for testing residential homes as well as significantly improve testing accuracy and repeatability. Partial turbulence simulation was used in the WOW to determine the performance of discontinuous perforated parapets in mitigating roof pressures. Comparisons of pressures with and without parapets showed significant reductions in pressure coefficients in the zones with high suctions. This demonstrated the potential of such aerodynamic add-on devices to reduce uplift forces.

Relevance:

100.00%

Publisher:

Abstract:

Conjugated polymers (CPs) are intrinsically fluorescent materials that have been used for various biological applications including imaging, sensing, and delivery of biologically active substances. The synthetic control over flexibility and biodegradability of these materials aids the understanding of the structure-function relationships among the photophysical properties, the self-assembly behaviors of the corresponding conjugated polymer nanoparticles (CPNs), and the cellular behaviors of CPNs, such as toxicity, cellular uptake mechanisms, and sub-cellular localization patterns. Synthetic approaches towards two classes of flexible CPs with well-preserved fluorescent properties are described. The synthesis of flexible poly(p-phenylenebutadiynylene)s (PPBs) uses competing Sonogashira and Glaser coupling reactions and the differences in monomer reactivity to incorporate a small amount (~10%) of flexible, non-conjugated linkers into the backbone. The reaction conditions provide limited control over the proportion of flexible monomer incorporation. Improved synthetic control was achieved in a series of flexible poly(p-phenyleneethynylene)s (PPEs) using modified Sonogashira conditions. In addition to controlling the degree of flexibility, the linker provides disruption of backbone conjugation that offers control of the length of conjugated segments within the polymer chain. Therefore, such control also results in the modulation of the photophysical properties of the materials. CPNs fabricated from flexible PPBs are non-toxic to cells, and exhibit subcellular localization patterns clearly different from those observed with non-flexible PPE CPNs. The subcellular localization patterns of the flexible PPEs have not yet been determined, due to the toxicity of the materials, most likely related to the side-chain structure used in this series. The study of the effect of CP flexibility on self-assembly reorganization upon polyanion complexation is presented. 
Owing to its high rigidity and hydrophobicity, the PPB backbone undergoes reorganization more readily than PPE. The effects are enhanced in the presence of the flexible linker, which enables more efficient π-π stacking of the aromatic backbone segments. Flexibility has minimal effects on the self-assembly of PPEs. Understanding the role of flexibility on the biophysical behaviors of CPNs is key to the successful development of novel efficient fluorescent therapeutic delivery vehicles.

Relevance:

100.00%

Publisher:

Abstract:

Omnibus tests of significance in contingency tables use statistics of the chi-square type. When the null is rejected, residual analyses are conducted to identify cells in which observed frequencies differ significantly from expected frequencies. Residual analyses are thus conditioned on a significant omnibus test. Conditional approaches have been shown to substantially alter type I error rates in cases involving t tests conditional on the results of a test of equality of variances, or tests of regression coefficients conditional on the results of tests of heteroscedasticity. We show that residual analyses conditional on a significant omnibus test are also affected by this problem, yielding type I error rates that can be up to 6 times larger than nominal rates, depending on the size of the table and the form of the marginal distributions. We explored several unconditional approaches in search of a method that maintains the nominal type I error rate, and found that a bootstrap correction for multiple testing achieves this goal. The validity of this approach is documented for two-way contingency tables in the contexts of tests of independence, tests of homogeneity, and fitting psychometric functions. Computer code in MATLAB and R to conduct these analyses is provided as Supplementary Material.
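The unconditional bootstrap idea can be sketched for a test of independence as follows. This is a minimal sketch, not the authors' MATLAB/R code: the function names, the use of Haberman-style adjusted residuals, and the choice of a familywise critical value via the maximum absolute residual are illustrative assumptions.

```python
import numpy as np

def std_residuals(table):
    """Adjusted standardized residuals for a two-way contingency table."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    r = table.sum(axis=1, keepdims=True)  # row totals
    c = table.sum(axis=0, keepdims=True)  # column totals
    expected = r * c / n
    return (table - expected) / np.sqrt(expected * (1 - r / n) * (1 - c / n))

def bootstrap_max_residual(table, n_boot=2000, seed=0):
    """Null distribution of the maximum |residual| under independence.

    Its 95th percentile serves as a single familywise critical value,
    which is the multiple-testing correction applied unconditionally
    (i.e., not gated on a significant omnibus chi-square test).
    """
    rng = np.random.default_rng(seed)
    table = np.asarray(table, dtype=float)
    n = int(table.sum())
    p = np.outer(table.sum(axis=1), table.sum(axis=0)) / n ** 2
    p /= p.sum()  # guard against floating-point drift
    maxima = np.empty(n_boot)
    for b in range(n_boot):
        sim = rng.multinomial(n, p.ravel()).reshape(table.shape)
        maxima[b] = np.abs(std_residuals(sim)).max()
    return maxima

table = [[30, 10, 5], [10, 25, 20]]
obs = np.abs(std_residuals(table))
crit = np.quantile(bootstrap_max_residual(table), 0.95)
flagged = obs > crit  # cells still significant after the correction
```

Because every cell is compared against the same simulated maximum, the familywise type I error rate stays near its nominal level regardless of whether the omnibus test rejected.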

Relevance:

100.00%

Publisher:

Abstract:

The transducer function mu for contrast perception describes the nonlinear mapping of stimulus contrast onto an internal response. Under a signal detection theory approach, the transducer model of contrast perception states that the internal response elicited by a stimulus of contrast c is a random variable with mean mu(c). Using this approach, we derive the formal relations between the transducer function, the threshold-versus-contrast (TvC) function, and the psychometric functions for contrast detection and discrimination in 2AFC tasks. We show that the mathematical form of the TvC function is determined only by mu, and that the psychometric functions for detection and discrimination have a common mathematical form with common parameters emanating from, and only from, the transducer function mu and the form of the distribution of the internal responses. We discuss the theoretical and practical implications of these relations, which bear on the tenability of certain mathematical forms for the psychometric function and on the suitability of empirical approaches to model validation. We also present the results of a comprehensive test of these relations using two alternative forms of the transducer model: a three-parameter version that renders logistic psychometric functions and a five-parameter version using Foley's variant of the Naka-Rushton equation as transducer function. Our results support the validity of the formal relations implied by the general transducer model, and the two versions that were contrasted account for our data equally well.
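The core relation can be illustrated with a small sketch: under equal-variance Gaussian internal responses, proportion correct in a 2AFC task is Phi((mu(pedestal + c) - mu(pedestal)) / sqrt(2)). The transducer below is a generic Naka-Rushton-style form with made-up parameter values, not the five-parameter Foley variant fitted in the study.

```python
import math

def mu(c, z=40.0, p=2.4, q=2.0, s=0.1):
    """Illustrative Naka-Rushton-style transducer: mu(0) = 0,
    accelerating at low contrast, compressive at high contrast."""
    return z * c ** p / (s ** q + c ** q)

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pc_2afc(c, pedestal=0.0):
    """Proportion correct in 2AFC, assuming equal-variance Gaussian
    internal responses with mean mu(.): detection when pedestal = 0,
    discrimination when pedestal > 0."""
    return phi((mu(pedestal + c) - mu(pedestal)) / math.sqrt(2.0))

# Detection: performance rises from chance (0.5) with increasing contrast.
detect = [pc_2afc(c) for c in (0.0, 0.05, 0.1, 0.2)]
```

In this framework the TvC function is recovered by asking, at each pedestal, which increment c drives pc_2afc to the threshold criterion, which is why its shape is determined by mu alone.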

Relevance:

100.00%

Publisher:

Abstract:

Funding sources: The study was funded by a research grant from the Chief Scientist’s Office of the Scottish Government Health and Social Care Directorates (CZH/4/971). The funder played no role in study design, data collection, data analysis, manuscript preparation and/or publication decisions. The views expressed herein are those of the authors and do not necessarily reflect those of the funder.

Relevance:

100.00%

Publisher:

Abstract:

Although atypical social behaviour remains a key characterisation of ASD, the presence of sensory and perceptual abnormalities has been given a more central role in recent classification changes. An understanding of the origins of such aberrations could thus prove a fruitful focus for ASD research. Early neurocognitive models of ASD suggested that the study of high frequency activity in the brain as a measure of cortical connectivity might provide the key to understanding the neural correlates of sensory and perceptual deviations in ASD. As our review shows, the findings from subsequent research have been inconsistent, with a lack of agreement about the nature of any high frequency disturbances in ASD brains. Based on the application of new techniques using more sophisticated measures of brain synchronisation, direction of information flow, and invoking the coupling between high and low frequency bands, we propose a framework which could reconcile apparently conflicting findings in this area and would be consistent both with emerging neurocognitive models of autism and with the heterogeneity of the condition.

Relevance:

100.00%

Publisher:

Abstract:

Bet-hedging strategies are used by organisms to survive in unpredictable environments. To pursue a bet-hedging strategy, an organism must produce multiple phenotypes from a single genotype. What molecular mechanisms allow this to happen? To address this question, I created a synthetic system that displays bet-hedging behavior, and developed a new technique called 'TrackScar' to measure the fitness and stress-resistance of individual cells. I found that bet-hedging can be generated by actively sensing the environment, and that bet-hedging strategies based on active sensing need not be metabolically costly. These results suggest that to understand how bet-hedging strategies are produced, microorganisms must be examined in the actual environments that they come from.

Relevance:

100.00%

Publisher:

Abstract:

In this dissertation, I explore the impact of several public policies on civic participation. Using a unique combination of school administrative and public-use voter files and methods for causal inference, I evaluate the impact of three new, as yet unexplored, policies: one informational, one institutional, and one skill-based. Chapter 2 examines the causal effect of No Child Left Behind's performance-based accountability school failure signals on turnout in school board elections and on individuals' use of exit. I find that failure signals mobilize citizens both at the ballot box and by encouraging them to vote with their feet. However, these increases in voice and exit come primarily from citizens who are already active, thus exacerbating inequalities in both forms of participation. Chapter 3 examines the causal effect of preregistration, an electoral reform that allows young citizens to enroll in the electoral system before turning 18 while also providing them with various in-school supports. Using data from the Current Population Survey and Florida Voter Files and multiple methods for causal inference, I (with my coauthor) show that preregistration mobilizes, and does so for a diverse set of citizens. Finally, Chapter 4 examines the impact of psychosocial, or so-called non-cognitive, skills on voter turnout. Using information from the Fast Track intervention, I show that early childhood investments in psychosocial skills have large, long-run spillovers on civic participation. These gains are widely distributed, and are especially large for those least likely to participate. These chapters provide clear insights that reach across disciplinary boundaries and speak to current policy debates. In placing specific attention not only on whether these programs mobilize but also on whom they mobilize, I provide scholars and practitioners with new ways of thinking about how to address stubbornly low and unequal rates of citizen engagement.