17 results for Study subject
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This communication is part of a larger teaching innovation project financed by the University of Barcelona, whose objective is to develop and evaluate the UB's transversal competences of learning ability and responsibility. The competence is divided into several sub-competences, the ability to analyse and synthesise being the one worked on most intensively in the first year. The work presented here builds on the results obtained in phases 1 and 2, previously implemented in other first-year subjects (Mathematics and History) of the Business Administration degree. Previous experience in those subjects revealed deficiencies in students' acquisition of learning skills. The work in Mathematics helped students become aware of this deficit, while the work in History focused on developing reading schemes and, through practical exercises, sought to deepen the development of this competence. The third phase presented here takes place in the second year, in the World Economy subject. Its objective is to develop and evaluate the same transversal competence as in the previous phases, through a practical exercise that combines quantitative analysis and critical reflection. Specifically, the exercise focuses on the dynamic relationship between economic growth and changes in the distribution of wealth. Both the design of the activity and the selection of materials were directed at addressing the gaps in the ability to analyse and synthesise detected in the first-year subjects during the previous phases of the project. The practical case is considered an adequate methodology for improving students' acquisition of the competence, and a way of evaluating that acquisition is also proposed: the exercise is assessed with a rubric developed within the framework of the project objectives. Thus, at the end of phase 3 we can analyse the process students have followed, detect where they have had major difficulties, and identify the aspects of teaching that can help improve their acquisition of skills. The interest of this phase lies in the possibility of assessing whether tracking learning through competences, organised collaboratively, is a good tool for developing these skills and facilitating their evaluation.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume in one of the largest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject on which publications are scarce and infrequent. We review the basic mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following and mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a cointegration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigour but still lack economic value and be considered useless from a practical point of view. This is why this project would not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we place emphasis on calibrating the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
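As an illustration of the mean-reversion idea behind pairs trading, the following is a minimal Python sketch (not the thesis code, which was written in MATLAB): it simulates an Ornstein-Uhlenbeck spread, calibrates the parameters from the exact AR(1) discretization, and generates a simple z-score entry/exit signal. All parameter values and thresholds below are hypothetical.

```python
# Minimal sketch: Ornstein-Uhlenbeck calibration and a z-score pairs signal.
# Hypothetical parameters; not the strategies or data used in the thesis.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a mean-reverting spread: dX = kappa*(mu - X) dt + sigma dW
kappa, mu, sigma, dt, n = 5.0, 0.0, 0.3, 1.0 / 252, 2000
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = x[t-1] + kappa * (mu - x[t-1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Calibrate from the exact AR(1) discretization:
# X_{t+dt} = a + b*X_t + eps, with b = exp(-kappa*dt), a = mu*(1 - b)
b, a = np.polyfit(x[:-1], x[1:], 1)
kappa_hat = -np.log(b) / dt
mu_hat = a / (1.0 - b)
resid = x[1:] - (a + b * x[:-1])
sigma_hat = resid.std(ddof=2) * np.sqrt(2 * kappa_hat / (1 - b**2))

# Pairs-trading signal on the stationary z-score: enter at |z| > 2, exit at |z| < 0.5
z = (x - mu_hat) / (sigma_hat / np.sqrt(2 * kappa_hat))
position = np.zeros(n)
for t in range(1, n):
    if position[t-1] == 0 and abs(z[t]) > 2.0:
        position[t] = -np.sign(z[t])      # short the spread when it is rich, long when cheap
    elif position[t-1] != 0 and abs(z[t]) < 0.5:
        position[t] = 0.0                 # close once the spread has reverted
    else:
        position[t] = position[t-1]

print(f"kappa={kappa_hat:.2f}  mu={mu_hat:.3f}  sigma={sigma_hat:.2f}  "
      f"trades={np.count_nonzero(np.diff(position))}")
```

The entry and exit thresholds here are arbitrary illustration values; in the spirit of the abstract, they and the model parameters would be recalibrated at high frequency as market conditions evolve.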
Abstract:
Background: There is evidence that exposure to passive smoking in general, and in babies in particular, is an important cause of morbidity and mortality. Passive smoking is related to an increased risk of pediatric diseases such as sudden infant death syndrome, acute respiratory diseases, worsening of asthma, acute and chronic middle ear disease, and slowing of lung growth. The objective of this article is to describe the protocol of the BIBE study, which aims to determine the effectiveness of a brief intervention within the context of Primary Care, directed at mothers and fathers who smoke, in order to reduce babies' exposure to environmental tobacco smoke (ETS). Methods/Design: Cluster randomized, multicenter, open field trial (control and intervention groups). Subjects: fathers and/or mothers who smoke and their babies (under 18 months) attending pediatric services in Primary Care in Catalonia. Measurements will be taken at three points in time from each of the fathers and/or mothers, who will answer a questionnaire on the baby's clinical background and the characteristics of the baby's exposure, together with variables related to the parents' tobacco consumption. A hair sample from the baby will be taken at the beginning of the study and six months after the initial visit (biological determination of nicotine). The intervention group will deliver a brief intervention on passive smoking after specific training, and the control group will provide usual care. Discussion: Exposure to ETS is an avoidable factor related to infant morbidity and mortality. Interventions to reduce babies' exposure to ETS are potentially beneficial for their health. The BIBE study evaluates an intervention to reduce exposure to ETS that takes advantage of pediatric visits. Advice delivered by pediatric professionals is an excellent opportunity for the prevention and protection of infants against the harmful effects of ETS.
Abstract:
Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out, using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the telephone follow-up interviews. A pilot study was carried out to test overall feasibility and to adjust some procedures before the fieldwork began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants, of whom 1,605 answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful for increasing the efficiency of tracing and interviewing the respondents.
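The two-step linkage described above (deterministic first, then probabilistic on first name and surnames) can be illustrated with a minimal Python sketch. The names, fields and similarity threshold below are invented for illustration and are not the study's actual procedure.

```python
# Hypothetical two-step record linkage: exact match first, fuzzy name match second.
from difflib import SequenceMatcher

cohort = [{"id": 1, "name": "MARIA GARCIA LOPEZ"},
          {"id": 2, "name": "JORDI PUIG SERRA"}]
registry = [{"name": "MARIA GARCIA LOPEZ", "status": "alive"},
            {"name": "JORDI PUIG SERRA-VILA", "status": "moved"}]

def link(person, records, threshold=0.85):
    # Step 1: deterministic linkage, exact match on the normalized full name.
    for rec in records:
        if rec["name"] == person["name"]:
            return rec, 1.0
    # Step 2: probabilistic linkage, keep the best fuzzy match above the threshold.
    best, score = None, 0.0
    for rec in records:
        s = SequenceMatcher(None, person["name"], rec["name"]).ratio()
        if s > score:
            best, score = rec, s
    return (best, score) if score >= threshold else (None, score)

for person in cohort:
    match, score = link(person, registry)
    print(person["id"], match["status"] if match else "unmatched", round(score, 2))
```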
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterated deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply the iterated deletion of dominated strategies through "a priori" reasoning. Instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies for the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
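As a point of reference, here is a minimal worked example of risk dominance in a generic symmetric 2x2 coordination game; the payoffs a, b, c, d are hypothetical and are not the payoff matrix used in the experiment.

```latex
% Generic symmetric coordination game with hypothetical payoffs a, b, c, d.
\[
\begin{array}{c|cc}
    & A     & B     \\ \hline
  A & a,\,a & c,\,d \\
  B & d,\,c & b,\,b
\end{array}
\qquad a > d,\quad b > c,
\]
% Both (A,A) and (B,B) are Nash equilibria; (A,A) is risk dominant when its
% product of deviation losses is larger:
\[
(a-d)^2 > (b-c)^2 \;\Longleftrightarrow\; a - d > b - c .
\]
```

Carlsson and Van Damme's global-games result is that when payoffs are observed with small noise, iterated deletion of dominated strategies selects the risk-dominant action, which matches the average outcome reported in the abstract.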
Abstract:
This paper explores the possibility of using data from social bookmarking services to measure the use of information by academic researchers. Social bookmarking data can be used to augment participative methods (e.g. interviews and surveys) and other, non-participative methods (e.g. citation analysis and transaction logs) to measure the use of scholarly information. We use BibSonomy, a free resource-sharing system, as a case study. Results show that published journal articles are by far the most popular type of source bookmarked, followed by conference proceedings and books. Commercial journal publisher platforms are the most popular type of information resource bookmarked, followed by websites, records in databases and digital repositories. Usage of open access information resources is low in comparison with toll access journals. In the case of open access repositories, there is a marked preference for the use of subject-based repositories over institutional repositories. The results are consistent with those observed in related studies based on surveys and citation analysis, confirming the possible use of bookmarking data in studies of information behaviour in academic settings. The main advantages of using social bookmarking data are that it is an unobtrusive approach, it captures the reading habits of researchers who are not necessarily authors, and data are readily available. The main limitation is that a significant amount of human resources is required to clean and standardize the data.
Abstract:
Background: In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take into account the within-subject correlation. Methods: For repeated-events data with censored failure times, the independent-increment (AG), marginal (WLW) and conditional (PWP) models are three multiple-failure models that generalize Cox's proportional hazards model. In this paper, we review the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum number of possible recurrences and sample size. We also study the methods' performance on a real dataset from a cohort study of bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem preferable to WLW for low correlation levels, but the situation reverses for high correlations. Conclusions: All methods are robust to censoring, worsen with increasing recurrence levels, and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, although they are well developed theoretically.
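As a rough illustration of the data layout behind the AG (Andersen-Gill) approach, the following Python sketch arranges recurrent events in the counting-process (start, stop] format and fits a Cox-type model with the lifelines package. The abstract does not say which software was used, the data below are invented, and the robust within-subject variance correction that the AG model is normally paired with is not shown.

```python
# Hedged sketch: counting-process (start, stop] layout for recurrent events
# and an AG-style Cox fit with lifelines. Synthetic data, hypothetical covariate.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per at-risk interval; a subject contributes a new row after each event.
# id = subject, start/stop = interval bounds (days), event = episode ended the
# interval, exposed = hypothetical binary covariate.
records = [
    (1,   0,  30, 1, 1),
    (1,  30,  90, 1, 1),
    (1,  90, 180, 0, 1),   # censored after the second episode
    (2,   0,  60, 1, 0),
    (2,  60, 180, 0, 0),
    (3,   0, 180, 0, 0),   # no episodes, fully censored
    (4,   0,  20, 1, 1),
    (4,  20,  50, 1, 1),
    (4,  50, 120, 1, 1),
    (4, 120, 180, 0, 1),
]
df = pd.DataFrame(records, columns=["id", "start", "stop", "event", "exposed"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for `exposed` under the AG-style layout
```

The WLW and PWP variants compared in the paper differ mainly in how risk sets and the variance are defined (marginal event-specific models versus conditional gap-time models), which this sketch does not capture.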
Abstract:
This contribution presents LRCW.net, a website devoted to the study of coarse and cooking wares in the late Antique Mediterranean. It is designed as a public website with an associated virtual laboratory intranet, where all institutions and researchers interested in the subject can work together towards a specific purpose, such as the creation of an on-line ‘encyclopedia’ for these categories of ceramics. The LRCW.net website and the associated virtual laboratory are just a small part of a wider initiative that aims to create an on-line Encyclopedia for Ancient Ceramics in the Mediterranean.
Abstract:
The concept of digital literacy has evolved along several paths over time with regard to the theoretical approach used to investigate its implications for the study of the gender digital divide in various real-life contexts. The main objective of this paper is to use an interdisciplinary approach to analyse some of the theoretical and empirical gaps present in the study of the gender digital divide. Some of the existing empirical studies on this question are reviewed, and future lines of research are proposed, with the aim of covering some of the gaps in research related to the implications of digital literacy for the analysis of the gender digital divide.
Abstract:
This study analyzed the stability and consistency of coping among adolescents. The objectives were twofold: (a) to analyze the temporal stability and cross-situational consistency of coping responses after a 17-month interval, taking into account gender, age and type of stressor; and (b) to analyze the relative weight of contextual versus dispositional factors in predicting future coping. A cohort of 341 adolescents (51% girls and 49% boys aged between 12 and 16) was assessed twice by means of the Coping Responses Inventory - Youth. The results indicated that coping responses were quite stable over time at the group level, but with important within-subject differences. Girls showed slightly more stability than boys. Among the girls, Avoidance coping showed as much stability as consistency, and Approach coping showed more stability than consistency. Among the boys, Avoidance coping showed more stability than consistency, and Approach coping showed both low stability and low consistency. Among the boys, the coping used at Time 1 barely predicted that used at Time 2; in contrast, among the girls, the type of coping used in the past, especially Avoidance coping, predicted the coping that would be used in the future.
Abstract:
This study extends the standard econometric treatment of appellate court outcomes by 1) considering the role of decision-maker effort and case complexity, and 2) adopting a multi-categorical selection process of appealed cases. We find evidence of appellate courts being affected by both the effort made by first-stage decision makers and case complexity. This illustrates the value of widening the narrowly defined focus on heterogeneity in individual-specific preferences that characterises many applied studies on legal decision-making. Further, the majority of appealed cases represent non-random sub-samples and the multi-categorical selection process appears to offer advantages over the more commonly used dichotomous selection models.
Abstract:
Genome-wide linkage studies have identified the 9q22 chromosomal region as linked with colorectal cancer (CRC) predisposition. A candidate gene in this region is transforming growth factor beta receptor 1 (TGFBR1). Investigation of TGFBR1 has focused on the common genetic variant rs11466445, a short exonic deletion of nine base pairs which results in truncation of a stretch of nine alanine residues to six alanine residues in the gene product. While the six-alanine (*6A) allele has been reported to be associated with increased risk of CRC in some population-based study groups, this association remains the subject of robust debate. To date, reports have been limited to population-based case-control association studies, or case-control studies of CRC families selecting one affected individual per family. No study has yet taken advantage of all the genetic information provided by multiplex CRC families. Methods: We have tested for an association between rs11466445 and risk of CRC using several family-based statistical tests in a new study group comprising members of non-syndromic high-risk CRC families sourced from three familial cancer centres, two in Australia and one in Spain. Results: We report a nominally significant result using the pedigree-based association test approach (PBAT; p = 0.028), while other family-based tests were non-significant, though each had a p-value < 0.10. These other tests included the Generalised Disequilibrium Test (GDT; p = 0.085), the parent-of-origin Generalised Disequilibrium Test (GDT-PO; p = 0.081) and the empirical Family-Based Association Test (FBAT; p = 0.096, additive model). Related-person case-control testing using the 'More Powerful' Quasi-Likelihood Score Test did not provide any evidence for association (M-QLS; p = 0.41). Conclusions: After conservatively taking into account considerations for multiple hypothesis testing, we find little evidence for an association between the TGFBR1*6A allele and CRC risk in these families. The weak support for an increase in risk in CRC-predisposed families is in agreement with recent meta-analyses of case-control studies, which estimate only a modest increase in sporadic CRC risk among *6A allele carriers.
Abstract:
Objectives: To determine the incidence, severity and duration of lingual tactile and gustatory function impairments after lower third molar removal. Study Design: Prospective cohort study with intra-subject measures of 16 patients undergoing lower third molar extractions. Sensibility and gustatory functions were evaluated in each subject preoperatively, one week and one month after the extraction, using Semmes-Weinstein monofilaments and 5 different concentrations of NaCl, respectively. Additionally, all patients completed a questionnaire to assess subjective perceptions. Results: Although patients did not perceive any sensibility impairment, a statistically significant decrease was detected with the Semmes-Weinstein monofilaments. This alteration was present one week after the surgical procedure and had fully recovered one month after the extraction. There were no variations in gustatory function. Conclusions: Lower third molar removal under local anesthesia may cause slight lingual sensibility impairment. Most of these alterations remain undetected by patients. These lingual nerve injuries are present one week after the extraction and recover one month after surgery. Taste seems to remain unaffected after these procedures.
Abstract:
This paper provides a map of the scientific productivity of authors affiliated to a Spanish institution and who have addressed one of the most important current topics in schizophrenia: The study of cognitive performance. A search of the Web of Science yielded 125 articles that met the inclusion criteria. In order to provide a comprehensive overview of scientific productivity, we examine several bibliometric indicators, concerning both productivity and impact or visibility. The analysis also focuses on qualitative aspects of key theoretical importance, such as the kinds of cognitive functions that are most often assessed and the tests most widely used to evaluate them in clinical practice. The study shows that interest in the subject of cognitive function in schizophrenia has increased considerably in Spain since the beginning of this century. The results also highlight the need to standardize the type of tests to be used in the cognitive assessment of patients with schizophrenia.