Abstract:
2000 Mathematics Subject Classification: Primary 47A20, 47A45; Secondary 47A48.
Abstract:
This study analyzes the consumption tendencies of the postmodern age and the particular development of postmodern marketing, primarily through the example of tourism. Drawing on the Hungarian and international literature as well as their own research and observations, the authors confront accepted principles and theories with practice and call attention to the domestic difficulties of adapting marketing activity. An extremely interesting study by Ariel Zoltán Mitev and Dóra Horváth, "A posztmodern marketing rózsaszirmai" ("The Rose Petals of Postmodern Marketing"), appeared in issue 2008/9 of the journal Vezetéstudomány. That study is forward-looking, engaging, and in every respect constructive and novel; it strongly influenced the authors of the present article, largely because of these merits, while in places inviting further elaboration. It inspired the next step, the formulation of new contributions, which the present authors attempt here. The article is also an organic continuation of the authors' earlier joint publications, above all their recent article in the journal Marketing & Menedzsment.
Abstract:
In this study, discrete-time one-factor models of the term structure of interest rates and their application to the pricing of interest rate contingent claims are examined theoretically and empirically. The first chapter discusses the issues involved in pricing interest rate contingent claims and describes the Ho and Lee (1986), Maloney and Byrne (1989), and Black, Derman, and Toy (1990) discrete-time models. In the second chapter, a general discrete-time model of the term structure is presented from which the Ho and Lee, Maloney and Byrne, and Black, Derman, and Toy models can all be obtained. The general model also provides for the specification of an additional model, the ExtendedMB model. The third chapter illustrates the application of the discrete-time models to the pricing of a variety of interest rate contingent claims. In the final chapter, the performance of the Ho and Lee, Black, Derman, and Toy, and ExtendedMB models in pricing Eurodollar futures options is investigated empirically. The results indicate that the Black, Derman, and Toy and ExtendedMB models outperform the Ho and Lee model; little difference is detected between the Black, Derman, and Toy and ExtendedMB models.
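As context for how such lattice models price claims, the following is a minimal Python sketch of a recombining binomial short-rate lattice in the spirit of Ho and Lee, with a zero-coupon bond priced by backward induction under risk-neutral probability 1/2. It is not the thesis's implementation: the constant drift stands in for the calibrated time-dependent term that, in the actual model, is fitted so the lattice reprices the observed yield curve, and all parameter values are invented.

import numpy as np

def short_rate(i, j, r0=0.05, drift=0.01, sigma=0.01, dt=0.5):
    # Short rate after i steps and j up-moves on a Ho-Lee-style lattice.
    # A flat drift replaces the calibrated theta(t) of the real model.
    return r0 + drift * i * dt + sigma * np.sqrt(dt) * (2 * j - i)

def zero_coupon_price(n_steps, dt=0.5):
    # Backward induction: discount the expected value at each node,
    # with risk-neutral probability 1/2 for up and down moves.
    values = np.ones(n_steps + 1)            # payoff of 1 at maturity
    for i in range(n_steps - 1, -1, -1):     # step back through the tree
        rates = np.array([short_rate(i, j, dt=dt) for j in range(i + 1)])
        values = np.exp(-rates * dt) * 0.5 * (values[1:i + 2] + values[:i + 1])
    return values[0]

print(zero_coupon_price(n_steps=4))  # price of a 2-year zero with dt = 0.5

The same backward pass prices contingent claims by replacing the terminal payoff of 1 with the claim's payoff at each terminal node.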
Abstract:
This study investigated the treatment theories and procedures for postural control training used by occupational therapists (OTs) working with hemiplegic adults who have had a cerebrovascular accident (CVA) or traumatic brain injury (TBI). Data were collected through a national survey of 400 randomly selected physical disability OTs, with 127 usable surveys returned. Results showed that the most commonly used treatment theory was neurodevelopmental treatment (NDT), followed by the motor relearning program (MRP), proprioceptive neuromuscular facilitation (PNF), Brunnstrom's approach, and Rood's approach. The most common treatment posture was sitting, followed by standing, mat activity, equilibrium reaction training, and walking. The factors affecting the use of the various treatment theories and procedures were years certified, years of clinical experience, work situation, and work status. Pearson correlation analyses found significant positive relationships between treatment theories and postures, and there were significant high correlations between usage of all pairs of treatment procedures.
Abstract:
The goal of mangrove restoration projects should be to improve the community structure and ecosystem function of degraded coastal landscapes. This requires the ability to forecast how mangrove structure and function will respond to prescribed changes in site conditions, including hydrology, topography, and geophysical energies. Global, regional, and local factors can explain gradients of regulators (e.g., salinity, sulfides), resources (nutrients, light, water), and hydroperiod (frequency and duration of flooding) that collectively account for the stressors producing diverse patterns of mangrove properties across a variety of environmental settings. Simulation models of hydrology, nutrient biogeochemistry, and vegetation dynamics have been developed to forecast such patterns in mangroves of the Florida Coastal Everglades. These models provide insight into mangrove responses to specific restoration alternatives and test causal mechanisms of system degradation. We propose that these models can also assist in selecting performance measures for monitoring programs that evaluate project effectiveness; this selection process in turn improves model development and calibration for forecasting mangrove responses to restoration alternatives. Hydrologic performance measures include soil regulators (particularly soil salinity), surface topography of the mangrove landscape, and hydroperiod, including both the frequency and duration of flooding. Estuarine performance measures should include bay salinity, tidal amplitude, and freshwater discharge conditions (reflected in the salinity value). The most important performance measures from the mangrove biogeochemistry model include soil resources (bulk density, total nitrogen, and phosphorus) and soil accretion. Mangrove ecology performance measures should include forest dimension analysis (transects and/or plots), sapling recruitment, leaf area index, and faunal relationships. Estuarine ecology performance measures should include the habitat function of mangroves, which can be evaluated with growth rates of key species, habitat suitability analysis, isotope abundance of indicator species, and bird censuses. The list of performance measures can be modified according to the model output used to define the scientific goals of the restoration planning process, reflecting the specific goals of the project.
Abstract:
Effective treatment of sensory neuropathic pain in peripheral neuropathies and spinal cord injury (SCI) is one of the most difficult problems in modern clinical practice. Cell therapy that releases antinociceptive agents near the injured spinal cord is a logical next step in the development of treatment modalities, but few clinical trials, especially for chronic pain, have tested the potential of cell transplants to treat it. Cell lines derived from the human neuronal NT2 line, hNT2.17 and hNT2.19, which synthesize and release the neurotransmitters gamma-aminobutyric acid (GABA) and serotonin (5HT), respectively, have been used to evaluate the potential of cell-based release of antinociceptive agents near the lumbar dorsal horn sensory cell centers to relieve neuropathic pain after PNS (partial nerve and diabetes-related injury) and CNS (spinal cord injury) damage in rat models. Transplants of both cell lines potently and permanently reverse behavioral hypersensitivity without inducing tumors or other complications after grafting. Functioning as cellular minipumps for antinociception, human neuronal precursors such as these NT2-derived cell lines would likely provide a useful adjuvant or replacement for current pharmacological treatments for neuropathic pain.
Abstract:
A pre-test, post-test, quasi-experimental design was used to examine the effects of student-centered and traditional models of reading instruction on outcomes of literal comprehension and critical thinking skills. The sample consisted of 101 adult students enrolled in a high-level developmental reading course at a large, urban community college in the Southeastern United States; the experimental group comprised 48 students and the control group 53. Students in the experimental group spent limited time on a basic skills course text, with instructors using supplemental materials such as poems, news articles, and novels; discussions, the reading-writing connection, and student choice in material selection were also part of the student-centered curriculum. Students in the control group relied heavily on a course text and a vocabulary text for reading material, with great focus placed on basic skills, and activities consisted primarily of multiple-choice questioning and quizzes. Pre-test data were collected with the Descriptive Tests of Language Skills in Reading Comprehension; post-test data were taken from the Florida College Basic Skills Exit Test. A MANCOVA was used to determine whether either model of instruction led to significantly higher gains in literal comprehension or critical thinking skills, and a paired-samples t-test was used to compare pre-test and post-test means. The MANCOVA indicated no significant difference between instructional models on scores of literal comprehension and critical thinking, nor any significant difference between subgroups of age (under 25, and 25 and older) or language background (native English speakers and second-language learners). The t-test indicated, however, that students taught under both instructional models made significant gains in both literal comprehension and critical thinking skills from pre-test to post-test.
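To make the analysis concrete, here is a hedged Python sketch of the two tests named above, using scipy and statsmodels. The data file and column names (group, pre_literal, post_literal, post_critical) are invented for illustration; this shows the shape of the analysis, not the study's actual code.

import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.multivariate.manova import MANOVA

# Hypothetical data: one row per student.
df = pd.read_csv("reading_scores.csv")

# Paired-samples t-test: pre- vs. post-test literal comprehension.
t_stat, p_value = ttest_rel(df["post_literal"], df["pre_literal"])
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

# MANCOVA-style specification: both outcomes modeled against the
# instructional group, adjusting for the pre-test score as a covariate.
mv = MANOVA.from_formula("post_literal + post_critical ~ group + pre_literal",
                         data=df)
print(mv.mv_test())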
Abstract:
Quantitative Structure-Activity Relationship (QSAR) modelling has been applied extensively to predicting the toxicity of Disinfection By-Products (DBPs) in drinking water. Among many toxicological properties, the acute and chronic toxicities of DBPs have been widely used in DBP health risk assessment. These toxicities are correlated with molecular properties, which are in turn correlated with molecular descriptors. The primary goals of this thesis are: (1) to investigate the effects of molecular descriptors (e.g., chlorine number) on molecular properties such as the energy of the lowest unoccupied molecular orbital (ELUMO) via QSAR modelling and analysis; (2) to validate the models using internal and external cross-validation techniques; and (3) to quantify the model uncertainties through Taylor series and Monte Carlo simulation. QSAR analysis is one of the most important ways to predict molecular properties such as ELUMO. In this study, the number of chlorine atoms (NCl), the number of carbon atoms (NC), and the energy of the highest occupied molecular orbital (EHOMO) are used as molecular descriptors. Three approaches are typically used in QSAR model development: (1) linear or multi-linear regression (MLR); (2) partial least squares (PLS); and (3) principal component regression (PCR). A critical step in QSAR analysis is model validation, after the QSAR models are established and before they are applied to toxicity prediction. The DBPs studied include chlorinated alkanes, alkenes, and aromatics. In addition, validated QSARs are developed to describe the toxicity of selected groups of DBP chemicals (i.e., chloro-alkanes and aromatic compounds with a nitro or cyano group) to three types of organisms (fish, T. pyriformis, and P. phosphoreum) based on experimental toxicity data from the literature. The results show that: (1) QSAR models predicting molecular properties built by MLR, PLS, or PCR can be used either to select valid data points or to eliminate outliers; (2) the leave-one-out cross-validation procedure by itself is not enough to give a reliable representation of the predictive ability of QSAR models, but leave-many-out/k-fold cross-validation and external validation can be applied together to achieve more reliable results; (3) ELUMO is shown to correlate strongly with NCl for several classes of DBPs; and (4) according to uncertainty analysis using the Taylor method, the uncertainty of the QSAR models derives mostly from NCl for all DBP classes.
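The three regression approaches and the validation step described above can be outlined with scikit-learn, as in the sketch below. It is illustrative only: the descriptor matrix is synthetic stand-in data (columns playing the roles of NCl, NC, and EHOMO), PCR is assembled from PCA followed by linear regression since scikit-learn has no single PCR estimator, and the leave-one-out scores use mean squared error because R2 is undefined on single held-out points.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.random((40, 3))                       # stand-ins for NCl, NC, EHOMO
y = X @ np.array([-0.8, 0.1, 0.5]) + 0.05 * rng.standard_normal(40)  # toy ELUMO

models = {
    "MLR": LinearRegression(),
    "PLS": PLSRegression(n_components=2),
    "PCR": make_pipeline(PCA(n_components=2), LinearRegression()),
}

for name, model in models.items():
    loo = cross_val_score(model, X, y, cv=LeaveOneOut(),
                          scoring="neg_mean_squared_error")
    kf = cross_val_score(model, X, y, scoring="r2",
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
    print(f"{name}: LOO MSE = {-loo.mean():.4f}, 5-fold R2 = {kf.mean():.3f}")

External validation would follow the same pattern with a held-out set of compounds never used in fitting.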
Abstract:
The sedimentary sections of three cores from the Celtic margin provide high-resolution records of the terrigenous fluxes during the last glacial cycle. A total of 21 AMS 14C dates allow us to define age models with a resolution better than 100 yr during critical periods such as Heinrich events 1 and 2. Maximum sedimentary fluxes occurred at the Meriadzek Terrace site during the Last Glacial Maximum (LGM). Detailed X-ray imagery of core MD95-2002 from the Meriadzek Terrace shows no sedimentary structures suggestive of either deposition from high-density turbidity currents or significant erosion. Two paroxysmal terrigenous flux episodes have been identified. The first occurred after the deposition of Heinrich event 2 Canadian ice-rafted debris (IRD) and includes IRD from European sources; we suggest that the second represents an episode of deposition from turbid plumes that precedes the IRD deposition associated with Heinrich event 1. At the end of marine isotopic stage 2 (MIS 2) and the beginning of MIS 1, the highest fluxes are recorded on the Whittard Ridge, where they correspond to deposition from turbidity current overflows. Canadian icebergs rafted debris to the Celtic margin during Heinrich events 1, 2, 4 and 5. The high-resolution records of Heinrich events 1 and 2 show that in both cases the arrival of the Canadian icebergs was preceded by a European ice-rafting precursor event, which took place about 1-1.5 kyr earlier. Two rafting episodes of European IRD also occurred immediately after Heinrich event 2 and just before Heinrich event 1. The terrigenous fluxes recorded in core MD95-2002 during the LGM are the highest reported at hemipelagic sites on the northwestern European margin. The magnitude of the Canadian IRD fluxes at the Meriadzek Terrace is similar to that at oceanic sites.
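The flux figures behind such records come from an age-depth model: interpolating calibrated ages between dated levels gives sedimentation rates, which, multiplied by dry bulk density and the terrigenous fraction, yield terrigenous mass accumulation rates. The Python sketch below shows the arithmetic with invented control points and placeholder values, not the MD95-2002 data.

import numpy as np

# Hypothetical calibrated AMS 14C control points: depth (cm), age (yr BP).
depths_dated = np.array([0.0, 120.0, 260.0, 400.0, 550.0])
ages_dated = np.array([1000.0, 14500.0, 17800.0, 21000.0, 26500.0])

# Linear age-depth model: interpolate an age for every centimetre.
depths = np.arange(0.0, 551.0)
ages = np.interp(depths, depths_dated, ages_dated)

# Sedimentation rate (cm/yr), then terrigenous flux (g/cm2/yr).
sed_rate = np.gradient(depths, ages)
dry_bulk_density = 0.8      # g/cm3, placeholder value
terrigenous_fraction = 0.9  # placeholder value
flux = sed_rate * dry_bulk_density * terrigenous_fraction
print(flux.max())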
Abstract:
The rainbow smelt (Osmerus mordax) is an anadromous teleost that produces type II antifreeze protein (AFP) and accumulates modest urea and high glycerol levels in plasma and tissues as cryoprotectant mechanisms at sub-zero temperatures. Glyceroneogenesis is known to occur in the liver via a branch of glycolysis and gluconeogenesis and to be activated by low temperature; however, the precise mechanisms of glycerol synthesis and trafficking in smelt remain to be elucidated. The objective of this thesis was to provide further insight using functional genomic techniques [e.g., suppression subtractive hybridization (SSH) cDNA library construction and microarray analyses] and molecular analyses [e.g., cloning and quantitative reverse transcription polymerase chain reaction (QPCR)]. Novel molecular mechanisms related to glyceroneogenesis were deciphered by comparing the transcript expression profiles of glycerol-accumulating (cold temperature) and non-glycerol-accumulating (warm temperature) hepatocytes (Chapter 2) and livers from intact smelt (Chapter 3). Briefly, glycerol synthesis can be initiated from both amino acids and carbohydrate; however, carbohydrate appears to be the preferred source when it is readily available. In glycerol-accumulating hepatocytes, levels of the hepatic glucose transporter (GLUT2) plummeted, and transcript levels of a suite of genes (PEPCK, MDH2, AAT2, GDH and AQP9) associated with the mobilization of amino acids to fuel glycerol synthesis were all transiently higher. In contrast, in glycerol-accumulating livers from intact smelt, glycerol synthesis was primarily fuelled by glycogen degradation, with higher PGM and PFK (glycolysis) transcript levels. Whether initiated from amino acids or carbohydrate, there were common metabolic underpinnings. Increased PDK2 (an inhibitor of PDH) transcript levels would direct pyruvate derived from amino acids and/or DHAP derived from G6P to glycerol rather than to oxidation via the citric acid cycle. Robust LIPL (triglyceride catabolism) transcript levels would provide free fatty acids that could be oxidized to fuel ATP synthesis. Increased cGPDH (glyceroneogenesis) transcript levels were not required for increased glycerol production, suggesting that regulation is more likely by post-translational modification. Finally, levels of a transcript potentially encoding glycerol-3-phosphatase, an enzyme not yet characterized in any vertebrate species, were transiently higher. These comparisons also led to the novel discoveries that increased G6Pase (glucose synthesis) and GS (glutamine synthesis) transcript levels are part of the low temperature response in smelt. Glucose may provide increased colligative protection against freezing, whereas glutamine could store nitrogen released from amino acid catabolism in a non-toxic form and/or be used to synthesize urea via purine synthesis-uricolysis. Novel aspects of cryoprotectant osmolyte (glycerol and urea) trafficking were elucidated by cloning and characterizing three aquaglyceroporin (GLP)-encoding genes from smelt at the gene and cDNA levels in Chapter 4. GLPs are integral membrane proteins that facilitate the passive movement of water, glycerol and urea across cellular membranes. The highlight was the discovery that AQP10ba transcript levels increase in the posterior kidney only at low temperature. This AQP10b gene paralogue may have evolved to aid in the reabsorption of urea from the proximal tubule.
This research has contributed significantly to a general understanding of the cold adaptation response in smelt, and more specifically to the development of a working scenario for the mechanisms involved in glycerol synthesis and trafficking in this species.
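A side note on the QPCR comparisons reported above: relative transcript abundance is commonly summarized with the 2^(-ddCt) method of Livak and Schmittgen, sketched below in Python with invented threshold-cycle (Ct) values; the thesis's own quantification procedure is not detailed here, so this is a generic illustration.

def fold_change(ct_target_cold, ct_ref_cold, ct_target_warm, ct_ref_warm):
    # 2^(-ddCt): normalize each condition to a reference gene,
    # then express the cold condition relative to the warm one.
    d_ct_cold = ct_target_cold - ct_ref_cold
    d_ct_warm = ct_target_warm - ct_ref_warm
    return 2 ** -(d_ct_cold - d_ct_warm)

# Invented Ct values for a transcript in cold- vs. warm-acclimated liver.
print(fold_change(22.1, 18.0, 25.3, 18.2))  # 8.0, an 8-fold increase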
Abstract:
The problem of social diffusion has animated sociological thinking on topics ranging from the spread of an idea, an innovation, or a disease to the foundations of collective behavior and political polarization. While network diffusion has been a productive metaphor, the reality of diffusion processes is often muddier. Ideas and innovations diffuse differently from diseases, but, with a few exceptions, their diffusion has been modeled under the same assumptions as the diffusion of disease. In this dissertation, I develop two new diffusion models for "socially meaningful" contagions that address two of the most significant problems with current diffusion models: (1) that contagions can only spread along observed ties, and (2) that contagions do not change as they spread between people. I augment insights from these statistical and simulation models with an analysis of an empirical case of diffusion: the use of enterprise collaboration software in a large technology company. The empirical study focuses on when people abandon innovations, a crucial and understudied aspect of the diffusion of innovations. Using timestamped posts, I analyze in fine detail when people abandon the software.
To address the first problem, I suggest a latent space diffusion model. Rather than treating ties as stable conduits for information, this model treats ties as random draws from an underlying social space and simulates diffusion over the social space itself. To address the second problem, I suggest a diffusion model with schemas. Rather than treating information as though it spreads without change, the schema diffusion model allows people to modify the information they receive to fit an underlying mental model before passing it to others. Theoretically, the social space model integrates actor ties and attributes simultaneously in a single social plane, while incorporating schemas into diffusion processes gives an explicit form to the reciprocal influences that cognition and social environment have on each other. Practically, the latent space diffusion model produces statistically consistent diffusion estimates where using the network alone does not, and the schema diffusion model shows that introducing some cognitive processing into diffusion changes the rate and ultimate distribution of the spreading information. Combining the latent space models with a schema notion for actors thus improves our models of social diffusion both theoretically and practically.
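Here is a minimal Python sketch of the latent-space idea, under assumptions of my own choosing (a 2-D Euclidean social space, tie probabilities that decay with latent distance, and a simple contagion): node positions are latent, observed ties are random draws, and diffusion is simulated over the space rather than over any single drawn network.

import numpy as np

rng = np.random.default_rng(0)
n = 200
positions = rng.normal(size=(n, 2))   # latent social space (assumed 2-D)

def tie_probability(i, j, scale=1.0):
    # Ties are random draws: probability decays with latent distance.
    d = np.linalg.norm(positions[i] - positions[j])
    return np.exp(-d / scale)

def simulate_diffusion(n_steps=20, seed_node=0, transmit=0.1):
    # Each step, every current adopter exposes a freshly drawn set of
    # contacts, so diffusion runs over the space, not one fixed network.
    adopted = np.zeros(n, dtype=bool)
    adopted[seed_node] = True
    for _ in range(n_steps):
        for i in np.flatnonzero(adopted):
            probs = np.array([tie_probability(i, j) for j in range(n)])
            adopted |= (rng.random(n) < probs * transmit)
    return int(adopted.sum())

print(simulate_diffusion())  # adopters after 20 steps of one simulation

Averaging many such runs recovers diffusion curves that can be compared against estimates obtained from the observed network alone.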
The empirical case study focuses on how the changing value of an innovation, introduced by the innovation's network externalities, influences when people abandon it. I find that people are least likely to abandon an innovation when others in their network neighborhood currently use the software as well; the effect is particularly pronounced for a supervisor's current use and for the number of supervisory team members who currently use the software. This case study not only points to an important process in the diffusion of innovations but also suggests a new source for collecting and analyzing data on organizational processes: computerized collaboration systems.
Abstract:
The growing interest in quantifying the cultural and creative industries and in making visible the economic contribution of culture-related activities demands, first of all, the construction of internationally comparable analytical frameworks. Three major bodies currently address this issue, and their comparative study is the focus of this article: the UNESCO Framework for Cultural Statistics (FCS-2009), the European Framework for Cultural Statistics (ESSnet-Culture 2012), and the methodological resource of the "Convenio Andrés Bello" group for working with the Satellite Accounts on Culture in Ibero-America (CAB-2015). Cultural sector measurement provides the information necessary for sound planning of cultural policies, which in turn sustains industries and promotes cultural diversity. The text identifies the differences among the three models at three levels of analysis: the sectors, the cultural activities, and the criteria each framework uses to assign activities to sectors. The end result is that the cultural statistics of countries implementing different frameworks cannot be directly compared.