125 results for continuous-time models
Abstract:
The aim of this paper is to analyse how learning assessment, particularly the Continuous Assessment (CA) system, has been defined in the Public Administration and Management Diploma Course of the University of Barcelona (Spain). This course was a pioneering experiment at this university in implementing the guidelines of the European Higher Education Area (EHEA), and thus represents a good case study for verifying whether one of the cornerstones of the EHEA has been successfully accomplished. Using data obtained from the Teaching Plans drawn up by the lecturers of each subject, we are able to establish that the CA system has been progressively accepted to such an extent that it is now the assessment formula used by practically all of the lecturers, conforming in this way to the protocols laid down by the Faculty of Law in which this diploma course is taught. Nevertheless, we find high dispersion in how Continuous Assessment is actually defined; indeed, there seems to be no unified view of how it should be performed. This dispersion diminishes over time, however, and raises questions about the advisability of agreeing on criteria, considering the potential of CA as a pedagogical tool. Moreover, we find that the Unique Assessment system, which students may also apply for, is an option chosen only by a minority, with lecturers usually defining it as merely a theoretical and/or practical test that offers little innovation relative to traditional tests.
Abstract:
Work-related flow is defined as a sudden and enjoyable merging of action and awareness that represents a peak experience in the daily lives of workers. Employees' perceptions of challenge and skill and their subjective experiences in terms of enjoyment, interest and absorption were measured using the experience sampling method, yielding a total of 6981 observations from a sample of 60 employees. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes. According to the R², AICc and BIC indices, the nonlinear dynamical systems model (i.e. the cusp catastrophe model) fit the data better than the linear and logistic regression models. Likewise, the cusp catastrophe model appears to be especially powerful for modelling cases of high levels of flow. Overall, flow represents a nonequilibrium condition that combines continuous and abrupt changes across time. Research and intervention efforts concerned with this process should focus on the variable of challenge, which, according to our study, appears to play a key role in the abrupt changes observed in work-related flow.
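For readers unfamiliar with the model class, a minimal sketch of the canonical cusp catastrophe (generic notation, not necessarily the authors'): the state variable y, here the level of flow, settles at minima of the potential

```latex
V(y;\,a,b) \;=\; \tfrac{1}{4}\,y^{4} \;-\; \tfrac{1}{2}\,b\,y^{2} \;-\; a\,y,
\qquad
\frac{\partial V}{\partial y} \;=\; y^{3} - b\,y - a \;=\; 0,
```

where a and b are control variables (the asymmetry and bifurcation factors in cusp terminology). For b > 0 the equilibrium surface folds into two stable branches, so small changes in the controls can produce the abrupt jumps in y that the abstract attributes chiefly to challenge.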
Abstract:
The kinetics and microstructure of solid-phase crystallization under continuous heating conditions and random distribution of nuclei are analyzed. An Arrhenius temperature dependence is assumed for both nucleation and growth rates. Under these circumstances, the system has a scaling law such that the behavior of the scaled system is independent of the heating rate. Hence, the kinetics and microstructure obtained at different heating rates differ only in time and length scaling factors. Concerning the kinetics, it is shown that the extended volume evolves with time according to α_ex = [exp(κ_C t′)]^(m+1), where t′ is the dimensionless time. This scaled solution not only represents a significant simplification of the system description, it also provides new tools for its analysis. For instance, it has been possible to find an analytical dependence of the final average grain size on kinetic parameters. Concerning the microstructure, the existence of a length scaling factor has allowed the grain-size distribution to be numerically calculated as a function of the kinetic parameters.
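For context, in KJMA (Kolmogorov-Johnson-Mehl-Avrami) kinetics, which this scaling result extends to continuous heating, the extended volume relates to the actually transformed fraction by

```latex
\alpha \;=\; 1 - \exp\!\left(-\alpha_{\mathrm{ex}}\right),
\qquad
\alpha_{\mathrm{ex}} \;=\; \bigl[\exp(\kappa_C\, t')\bigr]^{\,m+1},
```

with t′ the dimensionless time of the scaled system; the exponent m reflects the growth mechanism and dimensionality, and κ_C is a scaled rate constant in the abstract's notation.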
Space Competition and Time Delays in Human Range Expansions. Application to the Neolithic Transition
Abstract:
Space competition effects are well known in many microbiological and ecological systems. Here we analyze such an effect in human populations. The Neolithic transition (the change from foraging to farming) was mainly the outcome of a demographic process that spread gradually throughout Europe from the Near East. In Northern Europe, archaeological data show a slowdown in the Neolithic rate of spread that can be related to a high indigenous (Mesolithic) population density hindering the advance as a result of the space competition between the two populations. We measure this slowdown from a database of 902 Early Neolithic sites and develop a time-delayed reaction-diffusion model with space competition between Neolithic and Mesolithic populations to predict the observed speeds. The comparison of the predicted speed with the observations and with a previous non-delayed model shows that both effects, the time delay due to the generation lag and the space competition between populations, are crucial in order to understand the observations.
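For orientation, a commonly used single-population version of the time-delayed front speed (without the space-competition term this paper adds) is

```latex
v \;=\; \frac{2\sqrt{a D}}{1 + a\,T/2},
```

where a is the initial population growth rate, D the diffusion coefficient of the dispersing population, and T the generation lag. Letting T → 0 recovers the classical Fisher speed 2√(aD), so the delay term is what slows the predicted front.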
Abstract:
Expression of water-soluble proteins of fresh pork Longissimus thoracis from 4 pure-breed pigs (Duroc, Large White, Landrace, and Piétrain) was studied to identify candidate protein markers for meat quality. Surface-enhanced laser desorption/ionisation time-of-flight mass spectrometry (SELDI-TOF-MS) was used to obtain the soluble protein profiles of Longissimus thoracis muscles. The pure breeds showed differences among the studied meat quality traits (pHu, drip loss, androstenone, marbling, intramuscular fat, texture, and moisture), but no significant differences were detected in sensory analysis. Associations between protein peaks obtained with SELDI-TOF-MS and meat quality traits, mainly water holding capacity, texture and skatole, were observed. Of these peaks, a total of 10 peaks from the CM10 array and 6 peaks from the Q10 array were candidate soluble protein markers for pork loin quality. The developed models explained a limited proportion of the variability; however, they point out interesting relationships between protein expression and meat quality.
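A minimal sketch of the kind of association model described here, using hypothetical data rather than the study's measurements: regress a quality trait on candidate peak intensities and report the explained variability.

```python
# Hypothetical illustration: associate SELDI-TOF-MS peak intensities
# with a meat-quality trait via ordinary least squares and report R^2.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_peaks = 40, 6                        # e.g. 6 candidate peaks
peaks = rng.normal(size=(n_samples, n_peaks))     # simulated intensities
drip_loss = peaks @ rng.normal(size=n_peaks) + rng.normal(scale=2.0, size=n_samples)

X = np.column_stack([np.ones(n_samples), peaks])  # add an intercept column
beta, *_ = np.linalg.lstsq(X, drip_loss, rcond=None)

pred = X @ beta
ss_res = np.sum((drip_loss - pred) ** 2)
ss_tot = np.sum((drip_loss - drip_loss.mean()) ** 2)
print("R^2 =", 1 - ss_res / ss_tot)   # the 'limited proportion of variability'
```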
Abstract:
Background: Oscillatory activity, which can be separated into background and oscillatory burst pattern activities, is supposed to be representative of local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially for cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events in single trials with a reliable and consistent method is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox which models, in a reasonable time, a bump time-frequency model from the wavelet representations of a set of signals. The software is provided with a Matlab toolbox which can compute wavelet representations before calling the stand-alone application automatically. Conclusion: The tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
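The toolbox itself computes wavelet representations in Matlab; a rough numpy equivalent of the Morlet time-frequency map that bump modelling starts from might look like the following (generic parameters, not those of the BSP bump toolbox):

```python
# Sketch of a Morlet wavelet time-frequency map for a 1-D signal.
import numpy as np

def morlet_tf(signal, fs, freqs, w0=6.0):
    """Return |CWT| of a 1-D signal at the given frequencies (Hz)."""
    n = len(signal)
    t = (np.arange(n) - n // 2) / fs
    tf = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                      # wavelet scale
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-t**2 / (2 * s**2))
        wavelet /= np.sqrt(s)                         # crude normalisation
        tf[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return tf

fs = 250.0                                            # sampling rate, Hz
sig = np.sin(2 * np.pi * 10 * np.arange(int(fs)) / fs)  # 1 s of 10 Hz 'alpha'
tfmap = morlet_tf(sig, fs, freqs=np.arange(4, 31))
print(tfmap.shape)                                    # (n_freqs, n_samples)
```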
Abstract:
Heavy-ion reactions and other collective dynamical processes are frequently described by different theoretical approaches for the different stages of the process, such as the initial equilibration stage, the intermediate locally equilibrated fluid dynamical stage, and the final freeze-out stage. For the last stage, the best known is the Cooper-Frye description, used to generate the phase space distribution of emitted, noninteracting particles from a fluid dynamical expansion or explosion, assuming a final ideal gas distribution or (less frequently) an out-of-equilibrium distribution. In this work we do not want to replace the Cooper-Frye description, but rather clarify how to use it, how to choose the parameters of the distribution and, eventually, how to choose the form of the phase space distribution used in the Cooper-Frye formula. Moreover, while the Cooper-Frye formula is used in connection with the freeze-out problem, the discussion of transitions between different stages of the collision is applicable to other transitions as well. More recently, hadronization and molecular dynamics models have been matched to the end of a fluid dynamical stage to describe hadronization and freeze-out. The stages of the model description can be matched to each other on space-time hypersurfaces, just as is done across the frequently used freeze-out hypersurface. This work presents a generalized description of how to match the stages of the description of a reaction to each other, extending the methodology used at freeze-out, in a simple covariant form that is easily applicable in its simplest version for most applications.
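For reference, the Cooper-Frye formula gives the invariant momentum spectrum of particles emitted across a freeze-out hypersurface S:

```latex
E\,\frac{dN}{d^{3}p} \;=\; \int_{S} f(x,p)\; p^{\mu}\, d\sigma_{\mu},
```

where f(x,p) is the phase-space distribution assumed on the hypersurface (an ideal gas distribution in the simplest case) and dσ_μ is the covariant surface element; the choices of f and of the matching hypersurface are exactly the freedoms this paper discusses.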
Abstract:
In this paper, the theory of hidden Markov models (HMM) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within a HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to find maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By means of introducing a parametric model for time-varying channel responses, a version of the algorithm which is more appropriate for mobile channels [time-dependent Baum-Welch (TDBW)] is derived. Aiming to compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. Results indicate that, in comparison to other Baum-Welch (BW) versions of the algorithm, the TDBW approach attains a remarkable enhancement in performance. For that purpose, only a moderate increase in computational complexity is needed.
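As a point of reference, here is a minimal numpy sketch of one standard (static-channel) Baum-Welch re-estimation pass for a discrete HMM; this is a generic illustration, not the paper's TDBW variant or its GSM channel model.

```python
# One scaled forward-backward EM update for a discrete HMM.
import numpy as np

def baum_welch_step(obs, A, B, pi):
    """obs: int array (T,); A: (S,S) transitions; B: (S,K) emissions; pi: (S,)."""
    T, S = len(obs), A.shape[0]

    # Forward pass, scaled to avoid numerical underflow.
    alpha = np.empty((T, S)); c = np.empty(T)
    alpha[0] = pi * B[:, obs[0]]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t-1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]

    # Backward pass with the same scaling factors.
    beta = np.empty((T, S)); beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t+1]] * beta[t+1])) / c[t+1]

    gamma = alpha * beta                       # per-time state posteriors
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :]) / c[1:, None, None]

    # M-step: re-estimate parameters from expected counts.
    A_new = xi.sum(0) / gamma[:-1].sum(0)[:, None]
    B_new = np.vstack([gamma[obs == k].sum(0) for k in range(B.shape[1])]).T
    B_new /= gamma.sum(0)[:, None]
    return A_new, B_new, gamma[0]

rng = np.random.default_rng(1)
obs = rng.integers(0, 2, size=200)             # toy binary observation stream
A = np.full((2, 2), 0.5)
B = np.full((2, 2), 0.5) + rng.normal(0, 0.05, (2, 2))
B /= B.sum(1, keepdims=True)
A, B, pi = baum_welch_step(obs, A, B, np.array([0.5, 0.5]))  # one EM iteration
```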
Abstract:
The inverse scattering problem concerning the determination of the joint time-delay Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to the generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A "realistic" situation with respect to the transmission of a finite number of signals is further considered. In such a case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto a subspace generated by the transmitted signals.
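For background, the generic reconstruction identity from frame theory on which such formulas build: if {g_n} is a frame for a Hilbert space and {g̃_n} its reciprocal (dual) frame, any element f is recovered as

```latex
f \;=\; \sum_{n} \langle f,\, g_{n}\rangle\, \tilde{g}_{n},
```

and truncating the sum to a finite set of transmitted signals yields, as the abstract notes, the orthogonal projection of f onto the subspace they generate.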
Abstract:
In this correspondence, we propose applying the hidden Markov models (HMM) theory to the problem of blind channel estimation and data detection. The Baum-Welch (BW) algorithm, which is able to estimate all the parameters of the model, is enriched by introducing some linear constraints emerging from a linear FIR hypothesis on the channel. Additionally, a version of the algorithm that is suitable for time-varying channels is also presented. Performance is analyzed in a GSM environment using standard test channels and is found to be close to that obtained with a nonblind receiver.
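In generic notation (not necessarily the authors'), the linear FIR hypothesis states that each received sample is a finite convolution of the transmitted symbols with the channel taps plus noise:

```latex
y_{k} \;=\; \sum_{i=0}^{L-1} h_{i}\, x_{k-i} \;+\; n_{k},
```

so the mean output associated with each HMM state, a particular length-L symbol subsequence, is linear in the taps h_i; it is this linearity that makes it possible to impose constraints on the Baum-Welch re-estimation.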
Abstract:
The τ-function and the η-function are phenomenological models that are widely used in the context of timing interceptive actions and collision avoidance, respectively. Both models were previously considered to be unrelated to each other: τ is a decreasing function that provides an estimation of time-to-contact (ttc) in the early phase of an object approach; in contrast, η has a maximum before ttc. Furthermore, it is not clear how both functions could be implemented at the neuronal level in a biophysically plausible fashion. Here we propose a new framework, the corrected modified Tau function, capable of predicting both τ-type and η-type responses. The outstanding property of our new framework is its resilience to noise. We show that the corrected modified Tau function can be derived from a firing rate equation and, like η, serves to describe the response curves of collision-sensitive neurons. Furthermore, we show that it predicts the psychophysical performance of subjects determining ttc. Our new framework is thus validated successfully against published and novel experimental data. Within the framework, links between τ-type and η-type neurons are established. Therefore, it could possibly serve as a model for explaining the co-occurrence of such neurons in the brain.
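As commonly defined in this literature (generic forms; the paper's corrected modified Tau function itself is not reproduced here), for an approaching object subtending a visual angle θ(t),

```latex
\tau(t) \;=\; \frac{\theta(t)}{\dot{\theta}(t)},
\qquad
\eta(t) \;\propto\; \dot{\theta}(t)\, e^{-\alpha\,\theta(t)},
```

where τ approximates the time-to-contact for a constant approach velocity and decreases as the object looms, while the exponential attenuation (constant α > 0) gives η its characteristic maximum before contact.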
Abstract:
Background: During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects. Both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios. The model included time as a covariate. Then the hazard ratios were applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. On the other hand, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
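A hedged sketch of the two methodological steps, with toy data standing in for the real registry inputs: smooth event rates with a cubic spline to get a continuous hazard, then fit a Poisson model with time as a covariate and a log person-time offset to obtain hazard ratios.

```python
# Toy illustration (hypothetical numbers, not the study's data).
import numpy as np
from scipy.interpolate import CubicSpline
import statsmodels.api as sm

years = np.arange(1, 11)                        # years since diagnosis
deaths = np.array([30, 26, 22, 19, 16, 14, 12, 11, 10, 9])
persontime = np.full(10, 1000.0)                # person-years at risk

hazard = CubicSpline(years, deaths / persontime)  # continuous hazard rate
print(hazard(2.5))                                # rate between years 2 and 3

# Poisson regression: log E[deaths] = log(person-time) + b0 + b1*time + b2*region
region = np.repeat([0, 1], 5)                   # 0 = USA, 1 = Catalonia (toy)
X = sm.add_constant(np.column_stack([years, region]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson(),
             offset=np.log(persontime)).fit()
print(np.exp(fit.params[2]))                    # hazard ratio for region
```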
Abstract:
This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The eventual goal of this embedded vision system is to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs, based on existing linear color models and on fruit histograms, were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
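A hedged sketch of how a 3D RGB LUT classifier works in principle; the bin count and the "reddish" rule below are illustrative assumptions, not the paper's calibrated red-peach models.

```python
# Fruit/background pixel classification via a 3-D RGB look-up table.
import numpy as np

BINS = 32                                   # LUT resolution per channel
lut = np.zeros((BINS, BINS, BINS), dtype=np.uint8)

# Populate the LUT from a simple linear colour rule, e.g. 'reddish':
r, g, b = np.meshgrid(*[np.arange(BINS)] * 3, indexing="ij")
lut[(r > g + 4) & (r > b + 4)] = 1          # 1 = fruit-coloured cell

def classify(image):
    """Label each pixel of an HxWx3 uint8 image with one LUT lookup."""
    idx = image // (256 // BINS)            # quantise 0..255 -> 0..BINS-1
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
mask = classify(frame)                      # binary fruit mask per frame
print(mask.mean())                          # fraction of 'fruit' pixels
```

The appeal of the LUT on a microcontroller is that classification costs one memory read per pixel, with all colour-model arithmetic paid once, offline, when the table is built.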
Mueller matrix microscope with a dual continuous rotating compensator setup and digital demodulation
Abstract:
In this paper we describe a new Mueller matrix (MM) microscope that generalizes and makes quantitative the polarized light microscopy technique. In this instrument all the elements of the MM are simultaneously determined from the analysis in the frequency domain of the time-dependent intensity of the light beam at every pixel of the camera. The variations in intensity are created by the two compensators continuously rotating at different angular frequencies. A typical measurement is completed in a little over one minute, and it can be applied at any visible wavelength. Some examples are presented to demonstrate the capabilities of the instrument.
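A hedged sketch of the digital demodulation idea on a synthetic per-pixel intensity trace; the 5:3 frequency ratio and the harmonic list are assumptions for illustration, not this instrument's calibration.

```python
# Recover Fourier coefficients at the compensator harmonics via FFT.
import numpy as np

n = 2048                                    # samples per measurement
t = np.arange(n) / n                        # one common rotation period
w1, w2 = 5, 3                               # assumed compensator frequency ratio

# Synthetic intensity: DC term plus a few compensator harmonics.
I = (1.0 + 0.4 * np.cos(2 * np.pi * 2 * w1 * t)
         + 0.2 * np.sin(2 * np.pi * (2 * w1 + 2 * w2) * t))

spec = np.fft.rfft(I) / n
for k in (2 * w1, 2 * w2, 2 * w1 + 2 * w2, 2 * w1 - 2 * w2):
    a_k = 2 * spec[k].real                  # cosine amplitude at harmonic k
    b_k = -2 * spec[k].imag                 # sine amplitude at harmonic k
    print(k, round(a_k, 3), round(b_k, 3))
# The Mueller-matrix elements are then linear combinations of these
# coefficients, fixed by the compensator retardances and azimuths.
```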