996 results for Flow-oriented spirometer


Relevance:

30.00%

Publisher:

Abstract:

Today’s material flow systems for mass customization or dynamic production are usually realized with manual transportation systems. However, new concepts in the domain of material flow and device control, such as function-oriented modularization and intelligent multi-agent systems, offer the possibility of employing changeable, automated material flow systems in dynamic production structures. These systems need the ability to react to unplanned and unexpected events autonomously.

Relevance:

30.00%

Publisher:

Abstract:

Background: The cerebral network that is active during rest and deactivated during goal-oriented activity is called the default mode network (DMN). It appears to be involved in self-referential mental activity. Atypical functional connectivity in the DMN has been observed in schizophrenia. One hypothesis suggests that pathologically increased DMN connectivity in schizophrenia is linked with a main symptom of psychosis, namely, misattribution of thoughts. Methods: A resting-state pseudocontinuous arterial spin labeling (ASL) study was conducted to measure absolute cerebral blood flow (CBF) in 34 schizophrenia patients and 27 healthy controls. Using independent component analysis (ICA), the DMN was extracted from the ASL data. Mean CBF and DMN connectivity were compared between groups using a 2-sample t test. Results: Schizophrenia patients showed decreased mean CBF in the frontal and temporal regions (P < .001). ICA demonstrated significantly increased DMN connectivity in the precuneus (x/y/z = -16/-64/38) in patients compared with controls (P < .001). CBF was not elevated in the respective regions. DMN connectivity in the precuneus was significantly correlated with Positive and Negative Syndrome Scale scores (P < .01). Conclusions: In schizophrenia patients, the posterior hub, which is considered the strongest part of the DMN, showed increased connectivity. We hypothesize that this increase hinders the deactivation of the DMN and, thus, the translation of cognitive processes from an internal to an external focus. This might explain symptoms related to defective self-monitoring, such as auditory verbal hallucinations or ego disturbances.

Relevance:

30.00%

Publisher:

Abstract:

For a three-dimensional vertically-oriented fault zone, we consider the coupled effects of fluid flow, heat transfer and reactive mass transport, to investigate the patterns of fluid flow, temperature distribution, mineral alteration and chemically induced porosity changes. We show, analytically and numerically, that finger-like convection patterns can arise in a vertically-oriented fault zone. The onset and patterns of convective fluid flow are controlled by the Rayleigh number, which is a function of the thermal properties of the fluid and the rock, the vertical temperature gradient, and the height and the permeability of the fault zone. Vigorous fluid flow causes low temperature gradients over a large region of the fault zone. In such a case, flow across lithological interfaces becomes the most important mechanism for the formation of sharp chemical reaction fronts. The degree of rock buffering, the extent and intensity of alteration, the alteration mineralogy and in some cases the formation of ore deposits are controlled by the magnitude of the flow velocity across these compositional interfaces in the rock. This indicates that alteration patterns along compositional boundaries in the rock may provide some insights into the convection pattern. The advective mass and heat exchanges between the fault zone and the wallrock depend on the permeability contrast between the fault zone and the wallrock. A high permeability contrast promotes focused convective flow within the fault zone and diffusive exchange of heat and chemical reactants between the fault zone and the wallrock. However, a more gradual permeability change may lead to a regional-scale convective flow system in which the flow pattern in the fault affects large-scale fluid flow, mass transport and chemical alteration in the wallrocks.
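The Rayleigh-number control described above can be illustrated with a short calculation. One common porous-medium form of the Rayleigh number is sketched below; the formula variant and all property values are generic water/rock assumptions for illustration, not parameters taken from this study.

```python
import math

def porous_rayleigh(rho_f, c_f, beta, dT, K, H, mu, lam_m, g=9.81):
    """Porous-medium Rayleigh number (one common form):
    Ra = rho_f^2 * c_f * g * beta * dT * K * H / (mu * lam_m),
    with fluid density rho_f, heat capacity c_f, thermal expansivity beta,
    vertical temperature difference dT, permeability K, layer height H,
    viscosity mu, and medium thermal conductivity lam_m."""
    return (rho_f ** 2 * c_f * g * beta * dT * K * H) / (mu * lam_m)

# Illustrative values: water-like fluid in a 1 km tall fault zone
# with permeability 1e-12 m^2 and a 100 K temperature difference.
Ra = porous_rayleigh(rho_f=1000.0, c_f=4200.0, beta=1e-4, dT=100.0,
                     K=1e-12, H=1000.0, mu=1e-3, lam_m=2.5)
print(Ra > 4 * math.pi ** 2)  # compare against the classical onset threshold
```

With these numbers Ra comes out well above the classical onset value of 4π² for a porous layer heated from below, so convective fingering would be expected; lowering K by two orders of magnitude suppresses it.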

Relevance:

30.00%

Publisher:

Abstract:

Static analyses of object-oriented programs usually rely on intermediate representations that respect the original semantics while having a more uniform and basic syntax. Most work involving object-oriented languages and abstract interpretation omits the description of that intermediate language, or simply refers to the control flow graph (CFG) it represents. This lack of formalization, on the one hand, results in an absence of assurances regarding the correctness of the transformation and, on the other, typically couples the analysis tightly to the source language. In this work we present a framework for the analysis of object-oriented languages in which, in a first phase, the input program is transformed into a representation based on Horn clauses. This makes it possible, on the one hand, to prove the transformation correct subject to a simple condition and, on the other, to apply an existing analyzer for (constraint) logic programming to automatically derive a safe approximation of the semantics of the original program. The approach is flexible in the sense that the first phase decouples the analyzer from most language-dependent features, and correct because the set of Horn clauses returned by the transformation phase safely approximates the standard semantics of the input program. The resulting analysis is also reasonably scalable due to the use of mature, modular (C)LP-based analyzers. The overall approach allows us to report results for medium-sized programs.
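The transformation sketched above can be illustrated with a toy example. The clause printer and the factorial-style loop below are hypothetical, meant only to show the general shape of the encoding: each basic block of an imperative method becomes a Horn clause over its input/output variables, and loops become recursive clauses.

```python
def block_clause(name, params, body_literals):
    """Render one basic block as a Horn clause `name(params) :- body.`"""
    head = f"{name}({', '.join(params)})"
    if not body_literals:
        return head + "."
    return f"{head} :- {', '.join(body_literals)}."

# Toy method:  r = 1; i = 1; while (i <= n) { r = r * i; i = i + 1; }
# Entry block, loop body, and loop exit each become a clause.
clauses = [
    block_clause("fact", ["N", "R"], ["loop(N, 1, 1, R)"]),
    block_clause("loop", ["N", "I", "R0", "R"],
                 ["I =< N", "R1 is R0 * I", "I1 is I + 1", "loop(N, I1, R1, R)"]),
    block_clause("loop", ["N", "I", "R", "R"], ["I > N"]),
]
for c in clauses:
    print(c)
```

The resulting clauses safely describe the method's input/output relation and can be fed to an off-the-shelf (C)LP analyzer, which is the decoupling the abstract argues for.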

Relevance:

30.00%

Publisher:

Abstract:

Abstract interpreters rely on the existence of a fixpoint algorithm that calculates a least upper bound approximation of the semantics of the program. Usually, that algorithm is described in terms of the particular language under study and is therefore not directly applicable to programs written in a different source language. In this paper we introduce a generic, block-based, and uniform representation of the program control flow graph, and a language-independent fixpoint algorithm that can be applied to a variety of languages and, in particular, Java. Two major characteristics of our approach are accuracy (obtained through a top-down, context-sensitive approach) and reasonable efficiency (achieved by means of memoization and dependency-tracking techniques). We have also implemented the proposed framework and show some initial experimental results for standard benchmarks, which further support the feasibility of the solution adopted.
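A minimal sketch of such a language-independent fixpoint, assuming a generic worklist formulation in which block states are memoized and re-processed only when a successor's state grows; the parity-analysis example is invented for illustration and far simpler than the paper's top-down, context-sensitive engine.

```python
def fixpoint(cfg, entry, init, transfer, join, bottom):
    """Generic worklist fixpoint over a block-based CFG.
    cfg maps each block name to its list of successor blocks."""
    state = {b: bottom for b in cfg}
    state[entry] = init
    work = [entry]
    while work:
        b = work.pop()
        out = transfer(b, state[b])
        for succ in cfg[b]:
            joined = join(state[succ], out)
            if joined != state[succ]:   # state grew: re-process successor
                state[succ] = joined
                work.append(succ)
    return state

# Toy parity analysis of `x = 0; while (...): x += 1`.
cfg = {"entry": ["loop"], "loop": ["loop", "exit"], "exit": []}
flip = {"even": "odd", "odd": "even"}
transfer = lambda b, s: frozenset(flip[p] for p in s) if b == "loop" else s
result = fixpoint(cfg, "entry", frozenset({"even"}), transfer,
                  lambda a, b: a | b, frozenset())
print(sorted(result["exit"]))  # after the loop, x may be even or odd
```

The `join`/`transfer` pair is the only language-dependent part; swapping them changes the analyzed language or the abstract domain without touching the driver.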

Relevance:

30.00%

Publisher:

Abstract:

Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed, and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions. In such systems, Quality of Service (QoS) properties, such as performance, cost, availability, or security, are critical for the usability of services and their compositions in concrete applications.
Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to take into account simultaneously both the control and the data structures, dependencies, and operations in a composition. Computation cost analysis for service composition can support predictive monitoring and proactive adaptation by automatically inferring upper and lower bounds on computation cost as functions of the value or size of input messages. These cost functions can be used for adaptation by selecting the service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of the input data, which can be used to predict, at invocation time, potential or imminent Service Level Agreement (SLA) violations.
In mission-critical compositions, effective and accurate continuous QoS prediction can be achieved by constraint modeling of composition QoS based on its structure, empirical data known at runtime, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants engaged in complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and on the information content of the composition messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows.
Both the functional dependencies and the information content (described through user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
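The cost-driven candidate selection described above can be sketched briefly; the service names and cost coefficients below are invented for illustration, not taken from the thesis.

```python
# Hypothetical cost-bound functions of input message size n: each candidate
# service exposes a predicted cost as a function of the actual input.
candidates = {
    "svcA": lambda n: 5.0 + 0.02 * n,         # high fixed cost, linear growth
    "svcB": lambda n: 1.0 + 0.0005 * n * n,   # cheap start, quadratic growth
}

def cheapest(n):
    """Pick the candidate minimizing the predicted cost for input size n."""
    return min(candidates, key=lambda s: candidates[s](n))

print(cheapest(10), cheapest(500))
```

The crossover illustrates the point of data-dependent adaptation: the cheapest candidate is not fixed per composition but depends on the size of the message actually being passed.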

Relevance:

30.00%

Publisher:

Abstract:

This article presents the principal results of the doctoral thesis “Semantic-oriented Architecture and Models for Personalized and Adaptive Access to the Knowledge in Multimedia Digital Library” by Desislava Ivanova Paneva-Marinova (Institute of Mathematics and Informatics), successfully defended before the Specialised Academic Council for Informatics and Mathematical Modelling on 27 October 2008.

Relevance:

30.00%

Publisher:

Abstract:

Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The ecological function of the EMER is critical, since it mediates freshwater and nutrient inputs and controls the water quality in eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings as a result of upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Plan (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine-scale (∼1 km2) hydrodynamic models of this system that can be utilized as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area, based on long-term hydrologic data (1999–2007) and supplemented by hydrodynamic modeling with a MIKE FLOOD (DHI, http://dhigroup.com/) model, to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (the average ratio of Taylor River wet- to dry-season outflow was 6 to 1 during 1999–2006), with pronounced interannual variability of flow. The water budget shows a net dominance of through-flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area on approximately 80 days per year on average. The groundwater field measurements indicate a mostly upward-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest a high importance of the groundwater contribution to water salinity in the EMER.
The model performance is satisfactory during the dry season, when surface flow in the area is confined to the Taylor River channel. The model also provided guidance on the importance of capturing the overland flow component, which enters the area as sheet flow during the rainy season. Overall, the modeling approach is suitable for reaching a better understanding of the water budget in the mangrove region. However, more detailed field data are needed to confirm model predictions by further calibrating the overland flow parameters.

Relevance:

30.00%

Publisher:

Abstract:

Three sites were cored on the landward slope of the Nankai margin of southwest Japan during Leg 190 of the Ocean Drilling Program. Sites 1175 and 1176 are located in a trench-slope basin that was constructed during the early Pleistocene (~1 Ma) by frontal offscraping of coarse-grained trench-wedge deposits. Rapid uplift elevated the substrate above the calcite compensation depth and rerouted a transverse canyon-channel system that had delivered most of the trench sediment during the late Pliocene (1.06-1.95 Ma). The basin's depth is now ~3000 to 3020 m below sea level. Clay-sized detritus (<2 µm) did not change significantly in composition during the transition from trench-floor to slope-basin environment. Relative mineral abundances for the two slope-basin sites average 36-37 wt% illite, 25 wt% smectite, 22-24 wt% chlorite, and 15-16 wt% quartz. Site 1178 is located higher up the landward slope at a water depth of 1741 m, ~70 km from the present-day deformation front. There is a pronounced discontinuity ~200 m below seafloor between muddy slope-apron deposits (Quaternary-late Miocene) and sandier trench-wedge deposits (late Miocene; 6.8-9.63 Ma). Clay minerals change downsection from an illite-chlorite assemblage (similar to Sites 1175 and 1176) to one that contains substantial amounts of smectite (average = 45 wt% of the clay-sized fraction; maximum = 76 wt%). Mixing in the water column homogenizes fine-grained suspended sediment eroded from the Izu-Bonin volcanic arc, the Izu-Honshu collision zone, and the Outer Zone of Kyushu and Shikoku, but the spatial balance among those contributors has shifted through time. Closure of the Central America Seaway at ~3 Ma was particularly important because it triggered intensification of the Kuroshio Current. 
With stronger and deeper flow of surface water toward the northeast, the flux of smectite from the Izu-Bonin volcanic arc was dampened and more detrital illite and chlorite were transported into the Shikoku-Nankai system from the Outer Zone of Japan.

Relevance:

30.00%

Publisher:

Abstract:

Development of the Internet of Services will be hampered by heterogeneous Internet-of-Things infrastructures: inconsistencies in communicating with participating objects, in connectivity between them, in topology definition and data transfer, in access via cloud computing for data storage, and so on. Our proposed solutions apply to a random-topology scenario that allows multi-operational sensor networks to be established out of single networks, and single-service networks to be established with the participation of multiple networks, thus allowing virtual links to be created and resources to be shared. The designed layers are context-aware, application-oriented, and capable of representing physical objects to a management system, along with discovery of services. The reliability issue is addressed by deploying the IETF-supported IEEE 802.15.4 network model for low-rate wireless personal area networks. The flow-sensor achieved better results than the typical sensor in terms of reachability, throughput, energy consumption, and diversity gain, and performance can be improved further by allowing the maximum number of multicast groups.

Relevance:

30.00%

Publisher:

Abstract:

This study mainly aims to provide an inter-industry analysis through the subdivision of various industries in flow of funds (FOF) accounts. Combined with Financial Statement Analysis data from 2004 and 2005, the Korean FOF accounts are reconstructed to form "from-whom-to-whom" FOF tables, which are composed of 115 institutional sectors and correspond to the tables and techniques of input–output (I–O) analysis. First, power-of-dispersion indices are obtained by applying the I–O analysis method. Most service and IT industries, construction, and the light industries in manufacturing fall into the first-quadrant group, whereas the heavy and chemical industries are placed in the fourth quadrant, since their power indices in the asset-oriented system are comparatively smaller than those of other institutional sectors. Second, investments and savings induced by the central bank are calculated for monetary policy evaluation. Industries are divided into two groups to compare their features: the first group comprises industries whose power of dispersion in the asset-oriented system is greater than 1, whereas in the second group the index is less than 1. We found that the net induced investments (NII)-to-total-liabilities ratios of the first group are about half those of the second group, since the former's induced savings are markedly greater than the latter's.
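The power-of-dispersion computation can be sketched on a toy 2x2 coefficient matrix; the matrix values are invented, and the closed-form 2x2 Leontief inverse below stands in for the full 115-sector calculation.

```python
def power_of_dispersion(A):
    """Power-of-dispersion indices for a 2x2 coefficient matrix A:
    column sums of the Leontief inverse (I - A)^-1, normalized so that
    the indices average to 1 (an index > 1 marks a dispersing sector)."""
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    # closed-form inverse of (I - A) for the 2x2 case
    det = (1 - a) * (1 - d) - b * c
    L = [[(1 - d) / det, b / det],
         [c / det, (1 - a) / det]]
    col = [L[0][j] + L[1][j] for j in range(2)]   # column sums of L
    total = sum(col)
    return [2 * cj / total for cj in col]          # n = 2 sectors

idx = power_of_dispersion([[0.1, 0.2], [0.3, 0.1]])
print(idx)  # sector 0 disperses slightly more than sector 1
```

With these invented coefficients, sector 0's index exceeds 1 and sector 1's falls below it, mirroring the paper's first-quadrant/fourth-quadrant split.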

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to show and put together the results, obtained so far, useful for tackling a conjecture of graph theory proposed in 1954 by William Thomas Tutte. The conjecture in question is Tutte's 5-flow conjecture, which states that every bridgeless graph admits a nowhere-zero 5-flow, namely a flow with non-zero integer values between -4 and 4. We will start by giving some basics of graph theory, useful in what follows, and by proving some results about flows on oriented graphs, in particular about the flow polynomial. Next we will treat two cases: graphs embeddable in the plane $\mathbb{R}^2$ and graphs embeddable in the projective plane $\mathbb{P}^2$. In the first case we will see the relationship between flows and colorings and prove a theorem even stronger than Tutte's conjecture, using the 4-color theorem. In the second case we will see how, in 1984, Richard Steinberg used Fleischner's Splitting Lemma to show that there can be no minimal counterexample to the conjecture for graphs in the projective plane. In the fourth chapter we will look at the theorems of François Jaeger (1976) and Paul D. Seymour (1981): the former proved that every bridgeless graph admits a nowhere-zero 8-flow; the latter went even further, showing that every bridgeless graph admits a nowhere-zero 6-flow. The fifth and final chapter will give a short introduction to the Tutte polynomial and show how it is related to the flow polynomial via the Recipe Theorem. Finally, we will see some applications of flows through the study of networks and their properties.
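The defining condition of a nowhere-zero k-flow (non-zero edge values, conservation at every vertex) can be checked by brute force on small oriented graphs; this sketch is illustrative only and obviously does not scale.

```python
from itertools import product

def has_nowhere_zero_flow(n_vertices, edges, k):
    """Brute-force check for a nowhere-zero k-flow on an oriented graph.
    edges is a list of directed edges (u, v); edge values range over the
    non-zero integers -(k-1)..(k-1), and flow must be conserved at every
    vertex (in-flow equals out-flow)."""
    values = [v for v in range(-(k - 1), k) if v != 0]
    for assignment in product(values, repeat=len(edges)):
        balance = [0] * n_vertices
        for (u, v), f in zip(edges, assignment):
            balance[u] -= f   # flow leaves u
            balance[v] += f   # flow enters v
        if all(b == 0 for b in balance):
            return True
    return False

# An oriented triangle carries a nowhere-zero 2-flow (constant 1 around it);
# a graph with a bridge carries no nowhere-zero flow for any k.
print(has_nowhere_zero_flow(3, [(0, 1), (1, 2), (2, 0)], 2))
print(has_nowhere_zero_flow(2, [(0, 1)], 5))
```

The bridge example shows why the conjecture is stated for bridgeless graphs: conservation at an endpoint of a bridge forces its flow value to zero.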

Relevance:

20.00%

Publisher:

Abstract:

Rapidity-odd directed flow (v1) measurements for charged pions, protons, and antiprotons near midrapidity (y = 0) are reported for Au+Au collisions at √sNN = 7.7, 11.5, 19.6, 27, 39, 62.4, and 200 GeV, as recorded by the STAR detector at the Relativistic Heavy Ion Collider. At intermediate impact parameters, the proton and net-proton slope parameter dv1/dy|y=0 shows a minimum between 11.5 and 19.6 GeV. In addition, the net-proton dv1/dy|y=0 changes sign twice between 7.7 and 39 GeV. The proton and net-proton results qualitatively resemble predictions of a hydrodynamic model with a first-order phase transition from hadronic matter to deconfined matter, and differ from hadronic transport calculations.
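The slope parameter dv1/dy|y=0 is extracted from the v1(y) points near midrapidity; a minimal least-squares sketch, using synthetic rapidity-odd toy data rather than STAR measurements:

```python
def dv1_dy(ys, v1s):
    """Least-squares slope of v1(y) from (y, v1) points near y = 0."""
    n = len(ys)
    my, mv = sum(ys) / n, sum(v1s) / n
    num = sum((y - my) * (v - mv) for y, v in zip(ys, v1s))
    return num / sum((y - my) ** 2 for y in ys)

# Synthetic rapidity-odd toy data: v1(y) = -0.01 * y (invented values).
ys = [-0.8, -0.4, 0.0, 0.4, 0.8]
slope = dv1_dy(ys, [-0.01 * y for y in ys])
print(slope)
```

A sign change of this fitted slope as a function of beam energy is exactly the kind of feature the measurement tracks between 7.7 and 39 GeV.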

Relevance:

20.00%

Publisher:

Abstract:

Response surface methodology based on a Box-Behnken design (BBD) was successfully applied to optimize the operating conditions of the electrochemical oxidation of sanitary landfill leachate, with the aim of making this method feasible for scale-up. Landfill leachate was treated in a continuous batch-recirculation system, in which a dimensionally stable anode (DSA®) coated with a Ti/TiO2 and RuO2 oxide film was used. The effects of three variables, current density (mA/cm2), treatment time (min), and supporting electrolyte dosage (mol/L), on total organic carbon removal were evaluated. Optimized conditions with the highest desirability were obtained at 244.11 mA/cm2, 41.78 min, and 0.07 mol/L of NaCl, and at 242.84 mA/cm2, 37.07 min, and 0.07 mol/L of Na2SO4. Under the optimal conditions, 54.99 % chemical oxygen demand (COD) and 71.07 % ammonia nitrogen (NH3-N) removal was achieved with NaCl, and 45.50 % COD and 62.13 % NH3-N removal with Na2SO4. A new predictive kinetic model, obtained from the relation between the BBD and the kinetic model, was suggested.
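A Box-Behnken design for three factors, as used above, consists of 12 edge-midpoint runs plus center points; a minimal generator in coded units (a generic sketch of the standard design, not the authors' software):

```python
from itertools import combinations

def box_behnken(n_factors, n_center=1):
    """Box-Behnken design in coded units: each pair of factors takes all
    four +/-1 combinations while the remaining factors sit at 0,
    followed by n_center center points (all factors at 0)."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([0] * n_factors for _ in range(n_center))
    return runs

design = box_behnken(3)
print(len(design))  # 12 edge runs + 1 center point = 13
```

Mapping the coded levels -1/0/+1 back to real units (e.g. current density, time, electrolyte dosage) gives the actual run sheet for the experiment.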

Relevance:

20.00%

Publisher:

Abstract:

We report here for the first time that triboelectric charges on PET sheets can be used to seal and control the flow rate in paper-based devices. The proposed method exhibits simplicity and low cost, provides reversible sealing, and minimizes the effect of sample evaporation.