67 results for IMPOSED OPTIC FLOW
Abstract:
Phase separation dynamics in the presence of externally imposed stirring is studied. The stirring is assumed to be independent of the concentration field and is generated with a well-defined energy spectrum. Depending on the intensity and correlation length of this advective flow, the domain growth process is either favored or frozen. This behavior is explained by analytical arguments.
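A minimal sketch of the type of model commonly used in such studies, assuming a Cahn-Hilliard (model B) description of the concentration field phi advected by an imposed incompressible velocity field v with prescribed energy spectrum; the authors' exact equations may differ:

\frac{\partial \phi}{\partial t} + \mathbf{v}\cdot\nabla\phi = \Gamma\,\nabla^{2}\!\left( -\phi + \phi^{3} - \nabla^{2}\phi \right)

Since v is independent of phi, the competition is between coarsening driven by the right-hand side and the stretching and folding of domains by the advection; which one wins depends on the flow intensity and its correlation length relative to the domain size.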
Abstract:
We have studied the structural changes that fatty acid monolayers in the Ov phase undergo when a simple shear flow is imposed. Strong coupling is revealed by changes in domain structure observable with Brewster angle microscopy, suggesting the possibility of shear alignment. The dependence of the alignment on the molecular polar tilt shows that the mechanism differs from that in nematic liquid crystals. We argue that the degenerate lattice symmetry lines of the underlying pseudohexagonal lattice align with the flow direction, and we explain the observed alignment angle using geometrical arguments.
Abstract:
In October 1998, Hurricane Mitch triggered numerous landslides (mainly debris flows) in Honduras and Nicaragua, resulting in a high death toll and considerable damage to property. We studied the potential application of relatively simple and affordable spatial prediction models for landslide hazard mapping in developing countries, focusing on a region in NW Nicaragua, one of the areas most severely hit during the Mitch event. A landslide map was produced at 1:10 000 scale in a Geographic Information System (GIS) environment from the interpretation of aerial photographs and detailed field work; in this map, terrain failure zones were distinguished from the areas within reach of the mobilized materials. A Digital Elevation Model (DEM) of the study area with a pixel size of 20 m × 20 m was also employed. We carried out a comparative analysis of the terrain failures caused by Hurricane Mitch and a selection of four terrain factors extracted from the DEM that contributed to terrain instability. Land propensity to failure was determined with the aid of bivariate analysis and GIS tools and expressed in a terrain failure susceptibility map. To estimate the areas that could be affected by the path or deposition of the mobilized materials, we exploited the fact that under intense rainfall debris flows tend to travel long distances following the maximum slope and merging with the drainage network. Using the TauDEM extension for ArcGIS, we automatically generated flow lines following the maximum slope in the DEM, starting from the areas prone to failure in the terrain failure susceptibility map. The areas crossed by the flow lines from each terrain failure susceptibility class define the runout susceptibility classes represented in a runout susceptibility map. The combined study of terrain failure and runout susceptibility yielded a spatial prediction for landslides that could contribute to landslide risk mitigation.
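To illustrate the downslope flow-line idea, here is a minimal sketch of D8 steepest-descent routing on a gridded DEM; TauDEM's actual algorithms (e.g. D-infinity routing) are more sophisticated, and everything below is a hypothetical simplification:

import numpy as np

# Hypothetical inputs: a DEM array in metres (20 m pixels, as in the study)
# and source cells taken from the terrain failure susceptibility map.
def trace_flow_line(dem, start, cellsize=20.0, max_steps=10000):
    """Follow the steepest-descent (D8) direction from a source cell until
    a pit, a flat, or the grid edge stops the path; returns visited cells."""
    nrows, ncols = dem.shape
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    r, c = start
    path = [start]
    for _ in range(max_steps):
        candidates = []
        for dr, dc in neighbours:
            rr, cc = r + dr, c + dc
            if 0 <= rr < nrows and 0 <= cc < ncols:
                dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
                candidates.append(((dem[r, c] - dem[rr, cc]) / dist, rr, cc))
        if not candidates:
            break
        drop, rr, cc = max(candidates)   # steepest downslope neighbour
        if drop <= 0:                    # pit or flat: likely deposition zone
            break
        r, c = rr, cc
        path.append((r, c))
    return path

Cells crossed by flow lines traced from sources of a given susceptibility class would then inherit that class in the runout susceptibility map.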
Abstract:
Work-related flow is defined as a sudden and enjoyable merging of action and awareness that represents a peak experience in the daily lives of workers. Employees' perceptions of challenge and skill and their subjective experiences in terms of enjoyment, interest and absorption were measured using the experience sampling method, yielding a total of 6981 observations from a sample of 60 employees. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes. According to the R², AICc and BIC indices, the nonlinear dynamical systems model (i.e. the cusp catastrophe model) fit the data better than the linear and logistic regression models. The cusp catastrophe model appears especially powerful for modelling cases of high levels of flow. Overall, flow represents a nonequilibrium condition that combines continuous and abrupt changes over time. Research and intervention efforts concerned with this process should focus on the variable of challenge, which, according to our study, plays a key role in the abrupt changes observed in work-related flow.
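A minimal sketch of one common way such model comparisons are run (a Guastello-style polynomial regression for the cusp difference equation; the abstract does not specify the authors' estimation procedure, and all data and variable names below are hypothetical):

import numpy as np

# Hypothetical standardized data: z1, z2 = flow at consecutive sampling
# moments; a = asymmetry control (challenge); b = bifurcation control (skill).
rng = np.random.default_rng(0)
n = 500
z1 = rng.normal(size=n)
a = rng.normal(size=n)
b = rng.normal(size=n)
z2 = z1 + 0.1 * (a + b * z1 - z1 ** 3) + 0.1 * rng.normal(size=n)
dz = z2 - z1

def r2_aic(X, y):
    """Ordinary least squares fit; returns R^2 and (uncorrected) AIC."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    rss = resid @ resid
    r2 = 1 - rss / ((y - y.mean()) @ (y - y.mean()))
    aic = n * np.log(rss / n) + 2 * X.shape[1]
    return r2, aic

ones = np.ones(n)
X_lin = np.column_stack([ones, a, b])                          # linear model
X_cusp = np.column_stack([ones, z1 ** 3, z1 ** 2, b * z1, a])  # cusp model
print("linear:", r2_aic(X_lin, dz))
print("cusp  :", r2_aic(X_cusp, dz))

A markedly better fit for the cusp specification, as here, is the kind of evidence the abstract reports for abrupt changes in flow.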
Abstract:
The aim of this study is to consider the experience of flow from a nonlinear dynamics perspective. The processes and temporal nature of intrinsic motivation and flow suggest that flow experiences fluctuate over time in a dynamical fashion, so it can be argued that the potential for chaos is strong. The sample was composed of 20 employees (both full and part time) recruited from a number of different organizations and work backgrounds. The Experience Sampling Method (ESM) was used for data collection. Once the time series were obtained, they were subjected to analyses drawn from complexity theory (Visual Recurrence Analysis and Surrogate Data Analysis). Results showed that in 80% of the cases flow presented a chaotic dynamic: flow experiences delineated a complex dynamic whose patterns of change were not easy to predict. Implications of the study, its limitations and future research are discussed.
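As a rough illustration of the two techniques named above, here is a sketch of a recurrence matrix and an FFT phase-randomized surrogate; the threshold, series length and stand-in data are hypothetical simplifications:

import numpy as np

def recurrence_matrix(x, eps):
    """Boolean recurrence plot of a 1-D series: R[i, j] = |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return d < eps

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum but randomized Fourier phases,
    which destroys any nonlinear (possibly chaotic) structure."""
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, size=X.size)
    phases[0] = 0.0                       # keep the mean component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size) + x.mean()

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=400))       # stand-in for an ESM flow series
R = recurrence_matrix(x, eps=0.5 * x.std())
s = phase_randomized_surrogate(x, rng)
# A nonlinearity statistic computed on x would then be compared with its
# distribution over many such surrogates.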
Abstract:
Flow cytometry has become a valuable tool in cell biology. By analyzing large numbers of cells individually using light-scatter and fluorescence measurements, this technique reveals both cellular characteristics and the levels of cellular components. Flow cytometry has been developed to rapidly enumerate cells and to distinguish among different cell stages and structures using multiple staining. Beyond high-speed multiparametric data acquisition, the possibilities of analysis and cell sorting, which allow other characteristics of individual cells to be studied, have increased researchers' interest in this technique. This chapter gives an overview of the principles of flow cytometry and examples of the application of the technique.
Abstract:
Morphological transitions are analyzed for a radial multiparticle diffusion-limited aggregation process grown under a convective drift. The introduction of a tangential flow changes the morphology of the diffusion-limited structure into multiarm structures inclined against the flow, whose limit, as density decreases, consists of single arms. The case of shear flow is also considered. The anisotropy of the patterns is characterized by means of an analysis based on a tangential correlation function. The simulation results are compared with preliminary experimental results.
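A minimal sketch of the kind of simulation involved (on-lattice DLA with a drift-biased random walk; the radial, multiparticle and shear-flow variants in the abstract are more elaborate, and the bias parameter is hypothetical):

import numpy as np

rng = np.random.default_rng(2)
L = 201
grid = np.zeros((L, L), dtype=bool)
grid[L // 2, L // 2] = True              # seed at the centre

def step_probs(drift):
    """Probabilities for moves (+x, -x, +y, -y), biased along +x by drift."""
    p = np.array([1 + drift, 1 - drift, 1.0, 1.0])
    return p / p.sum()

p = step_probs(drift=0.3)                # imposed flow along +x
moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
for _ in range(2000):                    # walkers released one at a time
    x, y = rng.integers(1, L - 1, size=2)
    while True:
        dx, dy = moves[rng.choice(4, p=p)]
        x, y = (x + dx) % L, (y + dy) % L   # periodic boundaries
        # stick when any 4-neighbour already belongs to the aggregate
        if grid[(x + 1) % L, y] or grid[x - 1, y] or \
           grid[x, (y + 1) % L] or grid[x, y - 1]:
            grid[x, y] = True
            break

The anisotropy of the resulting cluster could then be quantified with a tangential correlation function, in the spirit of the analysis the abstract describes.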
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach to large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
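For orientation, here is a minimal coarse-to-fine sketch in the spirit of the unidirectional MR/OPT scheme, using plain Horn-Schunck iterations at each level rather than truncated Newton; it illustrates the multilevel idea only, not the FMG/OPT algorithm itself, and all parameters are hypothetical:

import numpy as np

def horn_schunck(I1, I2, u, v, alpha=10.0, n_iter=100):
    """Jacobi iterations for the Horn-Schunck model starting from (u, v)."""
    Iy, Ix = np.gradient(I1)            # spatial gradients (rows = y)
    It = I2 - I1                        # temporal derivative
    avg = lambda f: 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                            + np.roll(f, 1, 1) + np.roll(f, -1, 1))
    for _ in range(n_iter):
        u_bar, v_bar = avg(u), avg(v)
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v

def downsample(f):
    return 0.25 * (f[0::2, 0::2] + f[1::2, 0::2]
                   + f[0::2, 1::2] + f[1::2, 1::2])

def coarse_to_fine_flow(I1, I2, levels=3):
    """Solve on the coarsest grid first, then prolong the flow as the
    initial guess on each finer grid (unidirectional multilevel).
    Assumes image dimensions divisible by 2**(levels - 1)."""
    pyr = [(I1, I2)]
    for _ in range(levels - 1):
        pyr.append((downsample(pyr[-1][0]), downsample(pyr[-1][1])))
    u = np.zeros_like(pyr[-1][0])
    v = np.zeros_like(u)
    for J1, J2 in reversed(pyr):
        if u.shape != J1.shape:         # prolong flow to the finer grid
            u = 2.0 * np.kron(u, np.ones((2, 2)))
            v = 2.0 * np.kron(v, np.ones((2, 2)))
        u, v = horn_schunck(J1, J2, u, v)
    return u, v

The bidirectional FMG/OPT scheme differs in that it also transfers residual information back to coarse grids and treats each coarse grid correction as a search direction scaled by a line search.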
Abstract:
Background: The aim of this study was to evaluate how hospital capacity was managed, focusing on standardizing the admission and discharge processes. Methods: The study was set in a 900-bed university-affiliated hospital of the National Health Service, near Barcelona (Spain). This is a cross-sectional study of a set of interventions gradually implemented between April and December 2008, focused mainly on standardizing the admission and discharge processes to improve patient flow. Primary administrative data were obtained from the 2007 and 2009 Hospital Database. The main outcome measures were median length of stay, percentage of planned discharges, number of surgery cancellations and median number of delayed emergency admissions at 8:00 am. For bivariate statistical analysis, we used a Chi-squared test for linear trend for qualitative variables, and a Wilcoxon signed-ranks test and a Mann–Whitney test for non-normal continuous variables. Results: The median global length of stay was 8.56 days in 2007 and 7.93 days in 2009 (p<0.051). The percentage of patients admitted on the same day as surgery increased from 64.87% in 2007 to 86.01% in 2009 (p<0.05). The number of interventions cancelled due to lack of beds was 216 in 2007 and 42 in 2009. The percentage of planned discharges went from 43.05% in 2007 to 86.01% in 2009 (p<0.01). The median number of emergency patients waiting for an in-hospital bed at 8:00 am was 5 in 2007 and 3 in 2009 (p<0.01). Conclusions: Standardization of admission and discharge processes is largely within our control, and it offers a significant opportunity to increase bed capacity and hospital throughput.
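A minimal sketch of two of the tests named above, on purely synthetic stand-in data (the chi-squared test for linear trend is omitted, as it has no single-call scipy equivalent):

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical lengths of stay (days) for independent patient samples
los_2007 = rng.gamma(shape=2.0, scale=4.3, size=300)
los_2009 = rng.gamma(shape=2.0, scale=4.0, size=300)

# Mann-Whitney U: independent, non-normal samples (2007 vs 2009 patients)
u_stat, p_mw = stats.mannwhitneyu(los_2007, los_2009, alternative="two-sided")

# Wilcoxon signed-rank: paired measurements (e.g. per-ward medians per year)
ward_2007 = rng.gamma(2.0, 4.3, size=30)
ward_2009 = ward_2007 - rng.normal(0.5, 0.3, size=30)
w_stat, p_w = stats.wilcoxon(ward_2007, ward_2009)
print(p_mw, p_w)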
Abstract:
The orchestration of collaborative learning processes in face-to-face physical settings, such as classrooms, requires teachers to coordinate students, indicating who belongs to each group, which collaboration areas are assigned to each group, and how resources or roles should be distributed within the group. In this paper we present an Orchestration Signal system, composed of wearable Personal Signal devices and an Orchestration Signal manager. Teachers can configure color signals in the manager so that they are transmitted to the wearable devices to indicate different orchestration aspects. In particular, the paper describes how the system has been used to carry out a Jigsaw collaborative learning flow in a classroom where students received signals indicating which documents they should read, which group they were in and in which area of the classroom they were expected to collaborate. The evaluation results show that the proposed system facilitates a dynamic, visual and flexible orchestration.
Abstract:
Substantial collective flow is observed in collisions between lead nuclei at the Large Hadron Collider (LHC), as evidenced by the azimuthal correlations in the transverse momentum distributions of the produced particles. Our calculations indicate that the global v1 flow, which at RHIC peaked at negative rapidities (named the third flow component or antiflow), will at the LHC turn toward forward rapidities (to the same side and direction as the projectile residue). Potentially this can provide a sensitive barometer to estimate the pressure and transport properties of the quark-gluon plasma. Our calculations also take into account initial-state center-of-mass rapidity fluctuations and demonstrate that these are crucial for v1 simulations. In order to better study the transverse momentum dependence of the flow, we suggest a new "symmetrized" v_1^S(p_t) function, and we also propose a new method to disentangle the global v1 flow from the contribution generated by random fluctuations in the initial state. This will enhance the possibilities of studying the collective global v1 flow both in the STAR Beam Energy Scan program and at the LHC.
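For reference, a sketch of the standard event-plane definition of directed flow that such analyses build on; the paper's symmetrized v_1^S(p_t) function and its fluctuation-separation method are the authors' own constructions and are not reproduced here, and the toy event below is hypothetical:

import numpy as np

def directed_flow_v1(phi, psi_rp):
    """v1 = <cos(phi - Psi_RP)> over particles in a given (pt, y) bin,
    with phi the particle azimuth and Psi_RP the reaction-plane angle."""
    return np.mean(np.cos(phi - psi_rp))

# Toy event: azimuths drawn with a small v1 modulation via rejection sampling
rng = np.random.default_rng(4)
psi_rp, v1_true = 0.0, 0.05
phi = rng.uniform(-np.pi, np.pi, size=20000)
accept = rng.uniform(size=phi.size) < (1 + 2 * v1_true
                                       * np.cos(phi - psi_rp)) / 3
print(directed_flow_v1(phi[accept], psi_rp))   # should recover ~0.05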
Abstract:
We analyze the diffusion of a Brownian particle in a fluid under stationary flow. Using the scheme of nonequilibrium thermodynamics in phase space, we obtain a Fokker-Planck equation that we compare with those derived from kinetic theory and projection operator techniques. This equation exhibits a violation of the fluctuation-dissipation theorem. By implementing the hydrodynamic regime described by the first moments of the nonequilibrium distribution, we find relaxation equations for the diffusion current and pressure tensor, allowing us to arrive at a complete description of the system in the inertial and diffusion regimes. The simplicity and generality of the method we propose make it applicable to more complex situations, often encountered in soft condensed matter, in which not just one but several degrees of freedom are coupled to a nonequilibrium bath.
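A representative form of the kinetic equation involved (the standard phase-space Fokker-Planck equation for a Brownian particle of mass m and friction coefficient beta in an imposed stationary velocity field v(r); the equation the authors derive via nonequilibrium thermodynamics may contain additional terms):

\frac{\partial f}{\partial t} + \mathbf{u}\cdot\nabla_{\mathbf{r}} f = \beta\,\frac{\partial}{\partial\mathbf{u}}\cdot\left[ \left(\mathbf{u}-\mathbf{v}(\mathbf{r})\right) f + \frac{k_{B}T}{m}\,\frac{\partial f}{\partial\mathbf{u}} \right]

Here f(r, u, t) is the phase-space distribution; the imposed flow enters through the drift u - v(r), and it is around this stationary nonequilibrium state that the fluctuation-dissipation theorem fails to hold in its equilibrium form.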
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic one and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by restrictive prior assumptions and relies on well-tested heuristics.
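A minimal sketch of the general simheuristic pattern described above: build a deterministic counterpart from expected processing times, solve it with a classical heuristic (NEH is used here as a stand-in for the adapted heuristic, which the abstract does not name), then evaluate the resulting sequence by Monte Carlo simulation; the lognormal noise model is also an assumption of this sketch:

import numpy as np

rng = np.random.default_rng(5)
n_jobs, n_machines = 10, 5
mean_pt = rng.uniform(1, 10, size=(n_jobs, n_machines))  # E[processing time]

def makespan(seq, p):
    """Permutation flow-shop makespan for processing-time matrix p."""
    c = np.zeros((len(seq), p.shape[1]))
    for i, job in enumerate(seq):
        for m in range(p.shape[1]):
            c[i, m] = max(c[i - 1, m] if i else 0.0,
                          c[i, m - 1] if m else 0.0) + p[job, m]
    return c[-1, -1]

def neh(p):
    """NEH: insert jobs (longest total time first) at the best position."""
    order = np.argsort(-p.sum(axis=1))
    seq = [order[0]]
    for job in order[1:]:
        trials = [seq[:k] + [job] + seq[k:] for k in range(len(seq) + 1)]
        seq = min(trials, key=lambda s: makespan(s, p))
    return seq

# 1) Solve the deterministic counterpart built from expected times.
seq = neh(mean_pt)
# 2) Monte Carlo: estimate the makespan distribution under stochastic times.
samples = [makespan(seq, mean_pt * rng.lognormal(0.0, 0.2, mean_pt.shape))
           for _ in range(1000)]
print(np.mean(samples), np.std(samples))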