30 results for Speed and torque observers
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The speed and width of front solutions to reaction-dispersal models are analyzed both analytically and numerically. We perform our analysis for Laplace and Gaussian distribution kernels, both for delayed and nondelayed models. The results are discussed in terms of the characteristic parameters of the models.
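One common way such front speeds are computed (a sketch of the standard marginal-stability argument, not necessarily the authors' exact formulation) is to minimize the linearized speed v(λ) = [r + a(M(λ) − 1)]/λ over the decay rate λ, where M is the moment generating function of the dispersal kernel. All parameter values below are illustrative.

```python
import numpy as np

# Hedged sketch: marginal-stability front speed for a linearized
# reaction-dispersal model, v* = min_lam [r + a*(M(lam) - 1)] / lam,
# where M is the moment generating function of the dispersal kernel.
# r (growth rate), a (dispersal rate), alpha, sigma are illustrative values.

r, a = 1.0, 1.0

def mgf_laplace(lam, alpha=2.0):
    # Laplace kernel (alpha/2)*exp(-alpha*|x|): M(lam) = alpha^2/(alpha^2 - lam^2), |lam| < alpha
    return alpha**2 / (alpha**2 - lam**2)

def mgf_gauss(lam, sigma=0.5):
    # Gaussian kernel with standard deviation sigma: M(lam) = exp(sigma^2 * lam^2 / 2)
    return np.exp(0.5 * (sigma * lam)**2)

def front_speed(mgf, lam_max):
    # Minimize v(lam) on a fine grid of admissible decay rates.
    lam = np.linspace(1e-3, lam_max, 100_000)
    return np.min((r + a * (mgf(lam) - 1.0)) / lam)

v_laplace = front_speed(mgf_laplace, 1.999)   # stay below the pole at lam = alpha
v_gauss = front_speed(mgf_gauss, 10.0)
print(f"Laplace-kernel front speed: {v_laplace:.3f}")
print(f"Gaussian-kernel front speed: {v_gauss:.3f}")
```

The fatter-tailed Laplace kernel yields a faster front than a Gaussian kernel of smaller spread, illustrating how the kernel's tail controls the invasion speed.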
Abstract:
Particle fluxes (including major components and grain size) and oceanographic parameters (near-bottom water temperature, current speed and suspended sediment concentration) were measured along the Cap de Creus submarine canyon in the Gulf of Lions (GoL; NW Mediterranean Sea) during two consecutive winter–spring periods (2009–2010 and 2010–2011). Comparing these data with measurements of meteorological and hydrological parameters (wind speed, turbulent heat flux, river discharge) has shown the important role of atmospheric forcing in transporting particulate matter through the submarine canyon and towards the deep sea. Indeed, atmospheric forcing during the 2009–2010 and 2010–2011 winter months differed in both intensity and persistence, which led to distinct oceanographic responses. Persistent dry northern winds caused strong heat losses (14.2 × 10³ W m⁻²) in winter 2009–2010 that triggered a pronounced sea surface cooling compared to winter 2010–2011 (1.6 × 10³ W m⁻² lower). As a consequence, a large volume of dense shelf water formed in winter 2009–2010, which cascaded at high speed (up to ∼1 m s⁻¹) down Cap de Creus Canyon, as measured by a current meter at the head of the canyon. The lower heat losses recorded in winter 2010–2011, together with an increased river discharge, resulted in lower-density waters over the shelf, thus preventing the formation and downslope transport of dense shelf water. High total mass fluxes (up to 84.9 g m⁻² d⁻¹) recorded in winter–spring 2009–2010 indicate that dense shelf water cascading resuspended and transported sediments at least down to the middle canyon. Sediment fluxes were lower (28.9 g m⁻² d⁻¹) under the quieter conditions of winter 2010–2011. The dominance of the lithogenic fraction in mass fluxes during the two winter–spring periods points to a resuspension origin for most of the particles transported down canyon. The variability in organic matter and opal contents relates to seasonally controlled inputs associated with the plankton spring bloom during March and April of both years.
Abstract:
This paper proposes a very fast method for blindly initializing a nonlinear mapping which transforms a sum of random variables. The method provides a surprisingly good approximation even when the basic assumption is not fully satisfied. The method can be used successfully for initializing the nonlinearity in post-nonlinear mixtures or in Wiener system inversion, improving algorithm speed and convergence.
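The core idea, as typically realized (this is an illustrative sketch, not necessarily the paper's exact algorithm), is that a sum of independent random variables is approximately Gaussian by the central limit theorem, so the inverse nonlinearity can be initialized by Gaussianization: composing the empirical CDF of the observations with the inverse standard-normal CDF. The cubic distortion and sample sizes below are made up for illustration.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Latent signal: a sum of i.i.d. uniforms (approximately Gaussian by the CLT),
# observed through an unknown monotone nonlinearity f (here cubic, illustrative).
s = rng.uniform(-1, 1, size=(8, 5000)).sum(axis=0)
x = s**3 + 0.5 * s                       # unknown distortion f(s)

# Gaussianization: map each observation through the empirical CDF, then
# through the inverse standard-normal CDF.  This approximates f^{-1} up to
# an affine transformation and serves as a blind initialization.
ranks = x.argsort().argsort() + 1
u = ranks / (len(x) + 1)                 # empirical CDF values in (0, 1)
nd = NormalDist()
s_hat = np.array([nd.inv_cdf(p) for p in u])

# The recovered signal should correlate strongly with the true latent sum.
corr = np.corrcoef(s_hat, s / s.std())[0, 1]
print(f"correlation with latent signal: {corr:.3f}")
```

Because only ranks of the data are used, the initialization is blind: it needs no knowledge of the distortion f, only its monotonicity.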
Abstract:
The disintegration of recovered paper is the first operation in the preparation of recycled pulp. It is known that the defibering process follows first-order kinetics, from which the disintegration kinetic constant (KD) can be obtained in different ways. The disintegration constant can be obtained from the Somerville index results (%Sv) and from the dissipated energy per volume unit (Ss). The %Sv is related to the quantity of non-defibered paper, as a measure of the non-disintegrated fiber residual (percentage of flakes), which is expressed in disintegration time units. In this work, the disintegration kinetics of recycled coated paper was evaluated, working at a 20 rev/s rotor speed and for different fiber consistencies (6, 8, 10, 12 and 14%). The experimental disintegration kinetic constants, KD, were obtained through the analysis of the Somerville index as a function of time. As consistency increased, the disintegration time was drastically reduced. The disintegration kinetic constant (modelled KD), extracted from Rayleigh's dissipation function, showed a good correlation with the experimental values obtained from the evolution of the Somerville index or from the dissipated energy.
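The first-order kinetics described above can be sketched numerically: if the non-defibered flake content decays as F(t) = F0·exp(−KD·t), then KD is minus the slope of ln F against time. The time points and flake percentages below are illustrative synthetic values, not measured data.

```python
import numpy as np

# Hedged sketch: estimating a first-order disintegration constant K_D from
# the decay of non-defibered flake content F(t) = F0 * exp(-K_D * t).
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])   # disintegration time, minutes
K_true, F0 = 0.12, 40.0
F = F0 * np.exp(-K_true * t)                        # synthetic Somerville-style %flakes

# First-order kinetics is linear in log space: ln F = ln F0 - K_D * t,
# so K_D is minus the slope of a least-squares line through (t, ln F).
slope, intercept = np.polyfit(t, np.log(F), 1)
K_D = -slope
print(f"estimated K_D = {K_D:.3f} min^-1")
```

With noisy laboratory data the same log-linear regression applies; only the residual scatter around the fitted line changes.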
Abstract:
We extend a previous model of the Neolithic transition in Europe [J. Fort and V. Méndez, Phys. Rev. Lett. 82, 867 (1999)] by taking two effects into account: (i) we do not use the diffusion approximation (which corresponds to second-order Taylor expansions), and (ii) we take proper care of the fact that parents do not migrate away from their children (we refer to this as a time-order effect, in the sense that it implies that children grow up with their parents, before they become adults and can survive and migrate). We also derive a time-ordered, second-order equation, which we call the sequential reaction-diffusion equation, and use it to show that effect (ii) is the most important one, and that both of them should in general be taken into account to derive accurate results. As an example, we consider the Neolithic transition: the model predictions agree with the observed front speed, and the corrections relative to previous models are important (up to 70%).
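For context, the earlier time-delayed (hyperbolic reaction-diffusion) model this work extends is often summarized by the front speed v = 2√(aD)/(1 + aτ/2), which reduces to the classical Fisher speed 2√(aD) as the delay τ → 0. The comparison below uses illustrative parameter values, not those fitted in the paper.

```python
import math

# Hedged sketch comparing the classical Fisher front speed with the
# time-delayed (hyperbolic reaction-diffusion) speed used in the earlier
# Fort-Mendez model, v_delay = 2*sqrt(a*D) / (1 + a*tau/2).
# Parameter values are illustrative, not those fitted in the paper.
a = 0.03      # initial growth rate, 1/yr
D = 15.0      # diffusion coefficient, km^2/yr
tau = 25.0    # delay time (roughly one generation), yr

v_fisher = 2.0 * math.sqrt(a * D)
v_delay = v_fisher / (1.0 + a * tau / 2.0)
print(f"Fisher speed: {v_fisher:.3f} km/yr, delayed speed: {v_delay:.3f} km/yr")
```

Even this simple delay correction slows the front noticeably, which motivates the paper's further corrections for higher-order dispersal terms and time ordering.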
Abstract:
BACKGROUND: Previous cross-sectional studies report that cognitive impairment is associated with poor psychosocial functioning in euthymic bipolar patients. There is a lack of long-term studies to determine the course of cognitive impairment and its impact on functional outcome. METHODS: A total of 54 subjects were assessed at baseline and 6 years later; 28 had DSM-IV-TR bipolar I or II disorder (recruited, at baseline, from a Lithium Clinic Program) and 26 were healthy matched controls. All were assessed with a cognitive battery tapping into the main cognitive domains (executive function, attention, processing speed, verbal memory and visual memory) twice over a 6-year follow-up period. All patients were euthymic (Hamilton Rating Scale for Depression score lower than 8 and Young Mania Rating Scale score lower than 6) for at least 3 months before both evaluations. At the end of follow-up, psychosocial functioning was also evaluated by means of the Functioning Assessment Short Test. RESULTS: Repeated-measures multivariate analysis of covariance showed main effects of group in the executive, inhibition, processing speed and verbal memory domains (p<0.04). Among the clinical factors, only longer illness duration was significantly related to slow processing (p=0.01), whereas strong relationships were observed between impoverished cognition over time and poorer psychosocial functioning (p<0.05). CONCLUSIONS: Executive functioning, inhibition, processing speed and verbal memory were impaired in euthymic bipolar out-patients. Although cognitive deficits remained stable on average throughout the follow-up, they had enduring negative effects on the psychosocial adaptation of patients.
Abstract:
Today, most software development teams use free and open source software (FOSS) components, because it increases the speed and quality of development. Many open source components are the de facto standard of their category. However, FOSS comes with licensing restrictions, and corporate organizations usually maintain a list of allowed and forbidden licenses. But how do you enforce this policy? How can you make sure that ALL files in your source depot either belong to you or fit your licensing policy? A first, preventive approach is to train the development team and raise its awareness of these licensing issues. Depending on the size of the team, this may be costly but necessary. However, it does not ensure that a single individual will not commit a forbidden icon or library and jeopardize the legal status of the whole release... if not the company, since software is becoming more and more a critical asset. Another approach is to verify what is included in the source repository and check whether it belongs to the open-source world. This can be done on the fly, whenever a new file is added into the source depot. It can also be part of the release process, as a verification step before publishing the release. In both cases, there are tools and databases to automate the detection process. We will present the various options regarding FOSS detection, how this process can be integrated into the "software factory", and how the results can be displayed in a usable and efficient way.
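The "verify what is in the repository" approach can be sketched as a scan that matches file contents against license-indicating phrases and flags anything outside an allowlist. The phrase table, policy, and file names below are illustrative; real tools match against large curated license databases.

```python
# Hedged sketch of a license-policy audit: detect a license from telltale
# phrases in each file and flag files whose license is not on an allowlist.
ALLOWED = {"MIT", "BSD-3-Clause", "Apache-2.0"}
SIGNATURES = {
    "Permission is hereby granted, free of charge": "MIT",
    "Redistribution and use in source and binary forms": "BSD-3-Clause",
    "Licensed under the Apache License, Version 2.0": "Apache-2.0",
    "GNU General Public License": "GPL",
}

def detect_license(text):
    # Return the first license whose signature phrase appears in the text.
    for phrase, license_id in SIGNATURES.items():
        if phrase in text:
            return license_id
    return None

def audit(files):
    """Return {filename: license} for files violating the policy."""
    violations = {}
    for name, text in files.items():
        lic = detect_license(text)
        if lic is not None and lic not in ALLOWED:
            violations[name] = lic
    return violations

# Example: one compliant file, one GPL-licensed file slipped into the depot.
repo = {
    "util.c": "/* Permission is hereby granted, free of charge ... */",
    "vendored.c": "/* This file is covered by the GNU General Public License. */",
}
print(audit(repo))   # → {'vendored.c': 'GPL'}
```

Hooked into a pre-commit check or the release pipeline, the same audit runs on-the-fly or as a final verification step, as the abstract describes.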
Abstract:
Extension of shelf life and preservation of products are both very important for the food industry. However, just as with other processes, speed and higher manufacturing performance are also beneficial. Although microwave heating is utilized in a number of industrial processes, there are many unanswered questions about its effects on foods. Here we analyze whether the effects of continuous-flow microwave heating are equivalent to those of traditional heat transfer methods. In our study, the effects of heating liquid foods by conventional and continuous-flow microwave heating were examined. Among other properties, we compared the stability of the liquid foods between the two heat treatments. Our goal was to determine whether continuous-flow microwave heating and conventional heating methods have the same effects on liquid foods and, therefore, whether microwave heat treatment can effectively replace conventional heat treatments. We compared the colour and separation phenomena of the samples treated by the different methods. For milk we also monitored the total viable cell count, and for orange juice the vitamin C content, in addition to the taste of the products by sensory analysis. The majority of the results indicate that the circulating-coil microwave method used here is equivalent to the conventional heating method based on thermal conduction and convection. However, some results from the analysis of the milk samples show clear differences between heat transfer methods. According to our results, the colour parameters (lightness, red-green and blue-yellow values) of the microwave-treated samples differed not only from the untreated control but also from the traditionally heat-treated samples. The differences are visually undetectable; however, they become evident through analytical measurement with a spectrophotometer. This finding suggests that, besides thermal effects, microwave-based food treatment can alter product properties in other ways as well.
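Colour differences across lightness, red-green and blue-yellow axes are conventionally summarized by a single CIE76 colour distance, ΔE = √(ΔL² + Δa² + Δb²), where values below roughly 2-3 are generally hard to see by eye, matching "visually undetectable but measurable". The Lab values below are illustrative, not the study's measurements.

```python
import math

# Hedged sketch: CIE76 colour difference between two CIELAB colours,
# Delta-E = sqrt(dL^2 + da^2 + db^2).  Sample Lab values are illustrative.
def delta_e(lab1, lab2):
    # Euclidean distance in (L*, a*, b*) space.
    return math.dist(lab1, lab2)

control = (81.0, -2.1, 8.4)      # untreated milk sample (illustrative)
microwaved = (80.2, -1.8, 9.1)   # microwave-treated sample (illustrative)
dE = delta_e(control, microwaved)
print(f"Delta-E = {dE:.2f}")
```

A ΔE near 1 is invisible to most observers yet trivially detected by a spectrophotometer, which is exactly the kind of difference the abstract reports.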
Abstract:
We examine the phenomenon of hydrodynamic-induced cooperativity for pairs of flagellated micro-organism swimmers, of which spermatozoa cells are an example. We consider semiflexible swimmers, where inextensible filaments are driven by an internal intrinsic force and torque-free mechanism (intrinsic swimmers). The velocity gain for swimming cooperatively, which depends on both the geometry and the driving, develops as a result of the near-field coupling of bending and hydrodynamic stresses. We identify the regimes where hydrodynamic cooperativity is advantageous and quantify the change in efficiency. When the filaments' axes are parallel, hydrodynamic interaction induces a directional instability that causes semiflexible swimmers that profit from swimming together to move apart from each other. Biologically, this implies that flagella need to select different synchronized collective states and to compensate for directional instabilities (e.g., by binding) in order to profit from swimming together. By analyzing the cooperative motion of pairs of externally actuated filaments, we assess the impact that stress distribution along the filaments has on their collective displacements.
Abstract:
The expansion of broadband speed and coverage over IP technology, which extends across transport and terminal access networks, has increased the demand for applications and content that, being provided uniformly over it, give rise to convergence. These shifts in technologies and enterprise business models create the need to change the perspective and scope of the Universal Service and of the regulatory frameworks, the latter based on the same principles as always but varying in their application. Several aspects require special and renewed attention, such as the definition of relevant markets and dominant operators, the role of packages, interconnection of IP networks, network neutrality, the use of the spectrum with a vision of value for citizens, the application of the competition framework, new forms of licensing, treatment of risk in the networks, and changes in the regulatory authorities, amongst others. These matters are treated from the perspective of current trends in the world and their conceptual justification.
Abstract:
In this study we critically review the internal procedures of the accounting community for generating and disseminating knowledge. We contend that academic journals on accounting research are scarce, publish few articles and apply high rejection rates, and that the review process is lengthy and expensive. Additionally, an academic elite has unparalleled predominance in comparison to other business disciplines, reflected in an unusual share of published articles with authors affiliated to a small number of academic institutions, and in the predominance of certain topics and methodologies. The discipline does not allow the collaborative, iterative and flexible features of innovative knowledge communities. The discipline's internal procedures favour restriction, control, slowness and expiration rather than participation, speed and renewal. They are ill suited to advancing knowledge and bode ill for successful research. As a result, accounting academics present low research performance and the discipline is facing steady decline. More importantly, the discipline is handicapped in producing innovative knowledge able to contribute to critical research and long-term social well-being. We also focus on the Spanish institutional situation, arguing that Spanish requirements for reaching tenured positions are difficult for accountants to meet. We highlight the need to raise awareness of the problem and change the procedures.
Abstract:
The multidimensional process of physical, psychological and social change produced by population ageing affects not only the quality of life of elderly people but also that of our societies. Some dimensions of population ageing grow and expand over time (e.g. knowledge of world events, or experience in particular situations), while others decline (e.g. reaction time, physical and psychological strength, or other functional abilities, with reduced speed and increased tiredness). Information and Communication Technologies (ICTs) can help the elderly overcome possible limitations due to ageing. As a particular case, biometrics can enable the development of new algorithms for early detection of cognitive impairments by processing continuous speech, handwriting or other challenged abilities. Among all the possibilities, digital applications (Apps) for mobile phones or tablets can enable the dissemination of such tools. In this article, after presenting and discussing the process of population ageing and its social implications, we explore how ICTs, through different Apps, can lead to new solutions for facing this major demographic challenge.
Abstract:
BACKGROUND AND PURPOSE: The high variability of CSF volumes partly explains the inconsistency of anesthetic effects, but may also be due to the image analysis itself. In this study, criteria for threshold selection are anatomically defined. METHODS: T2 MR images (n = 7 cases) were analyzed using 3-dimensional software. Maximal and minimal thresholds were selected in standardized blocks of 50 slices of the dural sac, ending caudally at the L5-S1 intervertebral space (caudal blocks) and at middle L3 (rostral blocks). Maximal CSF thresholds: the threshold value was increased until at least one voxel in a CSF area appeared unlabeled, then decreased until that voxel was labeled again; this final threshold was selected. Minimal root thresholds: the threshold values that selected the cauda equina root area but not adjacent gray voxels in the CSF-root interface were chosen. RESULTS: Significant differences were found between caudal and rostral thresholds. No significant differences were found between expert and nonexpert observers. Average max/min thresholds were around 1.30, but max/min CSF volumes were around 1.15. Great interindividual CSF volume variability was detected (max/min volumes 1.6-2.7). CONCLUSIONS: The estimation of a close range of CSF volumes, which probably contains the real CSF volume value, can be standardized and calculated prior to certain intrathecal procedures.
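The maximal-CSF-threshold rule described above can be sketched as a simple search: raise the intensity threshold until at least one voxel in a known CSF region falls below it (becomes unlabeled), then step back one unit so every CSF voxel is labeled again. The synthetic voxel intensities below are illustrative, not MR data.

```python
import numpy as np

# Hedged sketch of the maximal-CSF-threshold selection rule.
rng = np.random.default_rng(1)
csf_region = rng.integers(120, 200, size=500)   # bright CSF voxel intensities (synthetic)

def maximal_csf_threshold(csf_voxels):
    t = 0
    # A voxel is "labeled CSF" when its intensity >= t.
    while np.all(csf_voxels >= t):
        t += 1                                   # raise until one CSF voxel drops out
    return t - 1                                 # step back: all CSF voxels labeled again

t_max = maximal_csf_threshold(csf_region)
print(f"maximal CSF threshold = {t_max}")
```

By construction the selected threshold equals the dimmest intensity in the CSF region, so the labeled volume is the largest one that still covers all the anatomically identified CSF voxels.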