978 results for Space representation


Relevance:

20.00%

Publisher:

Abstract:

This article discusses three possible ways to derive time domain boundary integral representations for elastodynamics. The discussion points out difficulties that may be found when using these formulations in practical applications, gives recommendations for selecting the most convenient integral representation for elastodynamic problems, and opens the possibility of deriving simplified schemes. The proper way to take into account initial conditions applied to the body is another interesting topic addressed; it illustrates the main differences between the discussed boundary integral representation expressions, their singularities, and possible numerical problems. The correct way to use collocation points outside the analyzed domain is carefully described. Some applications are shown at the end of the paper in order to demonstrate the capabilities of the technique when properly used.

Relevance:

20.00%

Publisher:

Abstract:

Multiprocessor system-on-chip (MPSoC) designs utilize the available technology and communication architectures to meet the requirements of upcoming applications. In an MPSoC, the communication platform is both the key enabler and the key differentiator for realizing efficient designs. It provides product differentiation by meeting a diverse, multi-dimensional set of design constraints, including performance, power, energy, reconfigurability, scalability, cost, reliability and time-to-market. The communication resources of a single interconnection platform cannot be fully utilized by every kind of application; for example, providing high communication bandwidth to applications that are computation intensive but not data intensive is often unfeasible in practical implementations. This thesis performs architecture-level design space exploration towards efficient and scalable resource utilization for MPSoC communication architectures. In order to meet the performance requirements within the design constraints, careful selection of the MPSoC communication platform and resource-aware partitioning and mapping of the application play an important role. To enhance the utilization of communication resources, a variety of techniques can be used, such as resource sharing, multicast to avoid re-transmission of identical data, and adaptive routing; for implementation, these techniques should be customized according to the platform architecture. To address the resource utilization of MPSoC communication platforms, a variety of architectures with different design parameters and performance levels are selected, namely the Segmented bus (SegBus), Network-on-Chip (NoC) and Three-Dimensional NoC (3D-NoC). Average packet latency and power consumption are the evaluation parameters for the proposed techniques. In conventional computing architectures, a fault on a component makes the connected fault-free components inoperative; a resource-sharing approach can utilize the fault-free components to retain system performance by reducing the impact of faults. Design space exploration also helps narrow down the selection of an MPSoC architecture that can meet the performance requirements within the design constraints.
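The flavor of this architecture-level exploration can be illustrated with a minimal sketch: candidate platform configurations are enumerated and filtered against latency and power constraints. The platform names follow the abstract, but the latency and power figures, the constraint thresholds, and the cost function are illustrative assumptions, not results from the thesis.

```python
# Minimal design-space-exploration sketch. The numbers and thresholds below
# are invented for illustration; only the platform names come from the text.
candidates = [
    {"platform": "SegBus", "avg_packet_latency": 42.0, "power_mw": 95.0},
    {"platform": "NoC",    "avg_packet_latency": 18.0, "power_mw": 140.0},
    {"platform": "3D-NoC", "avg_packet_latency": 11.0, "power_mw": 170.0},
]

def explore(candidates, max_latency, max_power_mw):
    """Keep configurations meeting both constraints, then rank them by a
    simple latency-power product (a stand-in for a real cost model)."""
    feasible = [c for c in candidates
                if c["avg_packet_latency"] <= max_latency
                and c["power_mw"] <= max_power_mw]
    return sorted(feasible, key=lambda c: c["avg_packet_latency"] * c["power_mw"])

if __name__ == "__main__":
    for c in explore(candidates, max_latency=30.0, max_power_mw=160.0):
        print(c["platform"], c["avg_packet_latency"], c["power_mw"])
```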

Relevance:

20.00%

Publisher:

Abstract:

Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm to determine this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
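As a simplified illustration of incremental height-field traversal (not the thesis's horizon-map algorithm itself), the classic single-direction terrain-shadow sweep below visits each sample once and carries the running shadow height forward, so the per-sample cost is O(1). The heights and light slope are made-up values.

```python
def sweep_shadows(heights, light_slope):
    """Mark which samples of a 1-D height profile lie in shadow for a
    directional light arriving from the left and descending `light_slope`
    height units per sample step. The running shadow height is reused from
    sample to sample, so each sample is processed in constant time."""
    shadow_height = float("-inf")
    in_shadow = []
    for h in heights:
        shadow_height = max(shadow_height - light_slope, h)
        in_shadow.append(h < shadow_height)
    return in_shadow

# Example with made-up heights and a light descending 0.5 units per step.
print(sweep_shadows([0.0, 3.0, 1.0, 1.5, 2.0, 0.5], light_slope=0.5))
```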

Relevance:

20.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking.
The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
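A minimal sketch of the execution model described above, written in plain Python rather than RVC-CAL: nodes communicate only through FIFO queues and fire independently whenever enough input tokens are available. The two-node graph, the token values, and the trivial run-until-idle scheduler are illustrative assumptions, not part of the thesis's tooling.

```python
from collections import deque

class Node:
    """A dataflow actor: fires when every input queue holds enough tokens,
    consumes them, and pushes the result to its output queues."""
    def __init__(self, func, inputs, outputs, tokens_needed=1):
        self.func, self.inputs, self.outputs = func, inputs, outputs
        self.tokens_needed = tokens_needed

    def can_fire(self):
        return all(len(q) >= self.tokens_needed for q in self.inputs)

    def fire(self):
        args = [q.popleft() for q in self.inputs
                for _ in range(self.tokens_needed)]
        result = self.func(*args)
        for q in self.outputs:
            q.append(result)

# Illustrative two-node pipeline: scale each sample, then add an offset.
source, a_to_b, b_out = deque([1, 2, 3, 4]), deque(), deque()
scale = Node(lambda x: 2 * x, inputs=[source], outputs=[a_to_b])
offset = Node(lambda x: x + 1, inputs=[a_to_b], outputs=[b_out])

# A trivial dynamic scheduler: keep firing any ready node until none can fire.
nodes = [scale, offset]
while any(n.can_fire() for n in nodes):
    for n in nodes:
        if n.can_fire():
            n.fire()

print(list(b_out))  # [3, 5, 7, 9]
```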

Relevance:

20.00%

Publisher:

Abstract:

Biofuels for transport are a renewable source of energy that were once heralded as a solution to multiple problems associated with poor urban air quality, the overproduction of agricultural commodities, the energy security of the European Union (EU) and climate change. It was only after the Union had implemented an incentivizing framework of legal and political instruments for the production, trade and consumption of biofuels that the problems of weakening food security, environmental degradation and increasing greenhouse gases through land-use changes began to unfold. In other words, the gap between the political aims for which biofuels are promoted and their consequences has grown, which is also recognized by EU policy-makers. Therefore, the global networks of producing, trading and consuming biofuels may face a complete restructuring if the European Commission accomplishes its pursuit to sideline crop-based biofuels after 2020. My aim with this dissertation is not only to trace the manifold evolutions of the instruments used by the Union to govern biofuels but also to reveal how this evolution has influenced the dynamics of biofuel development. Therefore, I study the ways in which the EU's legal and political instruments for steering biofuels are co-constitutive with the globalized spaces of biofuel development. My analytical strategy can be outlined through three concepts. I use the term 'assemblage' to approach the operations of the loose entity of actors and non-human elements that are the constituents of multi-scalar and multi-sectorial biofuel development. 'Topology' refers to the spatiality of this European biofuel assemblage and its parts, whose evolving relations are treated as the active constituents of space instead of simply being located in space. I apply the concept of 'nomosphere' to characterize the framework of policies, laws and other instruments that the EU applies and construes while attempting to govern biofuels. Even though both the materials and the methods vary in the independent articles, these three concepts characterize an analytical strategy that allows me to study law, policy and space in association with each other. The results of my examinations underscore the importance of the EU's instruments of governance in constituting and stabilizing the spaces of production and, on the other hand, how topological ruptures in biofuel development have enforced the need to reform policies. This analysis maps the vast scope of actors that are influenced by the mechanisms of EU biofuel governance and, what is more, shows how they actively engage in the Union's institutional policy formulation. By examining the consequences of fast biofuel development that are spatially dislocated from the established spaces of producing, trading and consuming biofuels, such as indirect land-use changes, I unfold the processes not tackled by the instruments of the EU. Indeed, it is these spatially dislocated processes that have pushed the Commission to construe a new way of governing biofuels: transferring the instruments of climate change mitigation to land-use policies. Although efficient in mitigating these dislocated consequences, these instruments have also created a peculiar ontological scaffolding for governing biofuels. According to this mode of governance, the spatiality of biofuel development appears to be already determined, and the agency that could dampen the negative consequences originating from land-use practices is treated as irrelevant.

Relevance:

20.00%

Publisher:

Abstract:

The theme of this thesis is context-specific independence in graphical models. Considering a system of stochastic variables, it is often the case that the variables are dependent on each other. This can, for instance, be seen by measuring the covariance between a pair of variables. Using graphical models, it is possible to visualize the dependence structure found in a set of stochastic variables. Using ordinary graphical models, such as Markov networks, Bayesian networks, and Gaussian graphical models, the type of dependencies that can be modeled is limited to marginal and conditional (in)dependencies. The models introduced in this thesis enable the graphical representation of context-specific independencies, i.e. conditional independencies that hold only in a subset of the outcome space of the conditioning variables. In the articles included in this thesis, we introduce several types of graphical models that can represent context-specific independencies. Models for both discrete variables and continuous variables are considered. A wide range of properties is examined for the introduced models, including identifiability, robustness, scoring, and optimization. In one article, a predictive classifier which utilizes context-specific independence models is introduced. This classifier clearly demonstrates the potential benefits of the introduced models. The purpose of the material included in the thesis prior to the articles is to provide the basic theory needed to understand the articles.
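To make the notion concrete: a context-specific independence states that a conditional independence holds only for particular values of the conditioning variables. The sketch below uses a made-up three-variable distribution (not one from the thesis) in which X and Y are independent in the context Z = 0 but dependent in the context Z = 1, and checks this numerically.

```python
import itertools

# Made-up conditional tables P(X, Y | Z) over binary variables, constructed so
# that independence of X and Y holds only in the context Z = 0.
cond = {
    # Z = 0: product of P(X=1)=0.5 and P(Y=1)=0.3, so independence holds.
    0: {(x, y): 0.5 * (0.3 if y == 1 else 0.7)
        for x, y in itertools.product([0, 1], repeat=2)},
    # Z = 1: X and Y strongly correlated, so independence fails.
    1: {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45},
}

def independent_in_context(table, tol=1e-9):
    """Check whether P(X, Y | Z=z) factorizes into P(X | Z=z) * P(Y | Z=z)."""
    px = {x: sum(p for (xx, _), p in table.items() if xx == x) for x in (0, 1)}
    py = {y: sum(p for (_, yy), p in table.items() if yy == y) for y in (0, 1)}
    return all(abs(table[(x, y)] - px[x] * py[y]) < tol
               for x, y in itertools.product([0, 1], repeat=2))

for z, table in cond.items():
    print(f"X independent of Y given Z={z}: {independent_in_context(table)}")
# Prints True for Z=0 and False for Z=1: a context-specific independence.
```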

Relevance:

20.00%

Publisher:

Abstract:

This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to the computation of the posterior probability density function. Except for a very restricted number of models, it is impossible to compute this density function in closed form; hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear non-Gaussian state space models via Monte Carlo. The performance of a particle filter heavily depends on the chosen importance distribution; for instance, an inappropriate choice of importance distribution can lead to failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, estimation of parameters can be done by Markov chain Monte Carlo (MCMC) methods. In its operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution. In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends highly on the chosen proposal distribution. A commonly used proposal distribution is Gaussian; in this kind of proposal, the covariance matrix must be well tuned, and adaptive MCMC methods can be used to tune it. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
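A minimal sketch of the linear-Gaussian building block mentioned above: a standard one-dimensional Kalman filter with prediction and update steps. This is textbook material rather than the thesis's particle-filter or MCMC machinery, and the model parameters and measurements below are invented for illustration.

```python
import numpy as np

def kalman_filter_1d(measurements, a=1.0, q=0.1, h=1.0, r=0.5, m0=0.0, p0=1.0):
    """One-dimensional Kalman filter for x_k = a*x_{k-1} + w_k, y_k = h*x_k + v_k,
    with process noise variance q and measurement noise variance r.
    Returns the filtered means and variances."""
    m, p = m0, p0
    means, variances = [], []
    for y in measurements:
        # Prediction step
        m_pred = a * m
        p_pred = a * p * a + q
        # Update step
        s = h * p_pred * h + r      # innovation variance
        k = p_pred * h / s          # Kalman gain
        m = m_pred + k * (y - h * m_pred)
        p = (1.0 - k * h) * p_pred
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)

# Illustrative run with made-up noisy observations of a slowly varying state.
ys = np.array([0.2, 0.4, 0.35, 0.6, 0.55])
means, variances = kalman_filter_1d(ys)
print(means.round(3), variances.round(3))
```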

Relevance:

20.00%

Publisher:

Abstract:

The objective of the present study was to determine the levels of amino acids in maternal plasma, the placental intervillous space and the fetal umbilical vein in order to identify the similarities and differences in amino acid levels in these compartments of 15 term newborns from normal pregnancies and deliveries. All amino acids, except tryptophan, were present in at least 186% higher concentrations in the intervillous space than in maternal venous blood, with the difference being statistically significant. This result contradicted the initial hypothesis of the study, namely that the plasma amino acid levels in the placental intervillous space should be similar to those of maternal plasma. When the maternal venous compartment was compared with the umbilical vein, we observed values 103% higher on the fetal side, which is compatible with currently accepted mechanisms of active amino acid transport. Amino acid levels in the placental intervillous space were similar to the values of the umbilical vein except for proline, glycine and aspartic acid, whose levels were significantly higher than fetal umbilical vein levels (on average 107% higher). The elevated levels in the intervillous space are compatible with syncytiotrophoblast activity, which maintains high concentrations of free amino acids inside syncytiotrophoblast cells, permitting asymmetric efflux or active transport from the trophoblast cells to the blood in the intervillous space. The plasma amino acid levels in the umbilical vein of term newborns may probably be used as a standard of local normality for clinical studies of amino acid profiles.

Relevance:

20.00%

Publisher:

Abstract:

Plasma amino acid levels have never been studied in the placental intervillous space of preterm gestations. Our objective was to determine the possible relationship between the plasma amino acids of maternal venous blood (M), of the placental intervillous space (PIVS) and of the umbilical vein (UV) of preterm newborn infants. Plasma amino acid levels were analyzed by ion-exchange chromatography in M from 14 parturients and in the PIVS and UV of their preterm newborn infants. Mean gestational age was 34 ± 2 weeks, weight was 1827 ± 510 g, and all newborns were considered adequate for gestational age. The mean Apgar scores were 8 and 9 at the first and fifth minutes. Plasma amino acid values were significantly lower in M than in PIVS (166%), except for aminobutyric acid. On average, plasma amino acid levels were significantly higher in UV than in M (107%) and were closer to PIVS than to M values, except for cystine and aminobutyric acid (P < 0.05). Comparison of the mean plasma amino acid concentrations in the UV of preterm newborns to those of term newborn infants previously studied by our group showed no significant difference, except for proline (P < 0.05, preterm > term). These data suggest that the mechanisms of active amino acid transport are centralized in the syncytiotrophoblast, with their passage to the fetus being an active bidirectional process with asymmetric efflux. The PIVS could be a reserve amino acid space that protects the fetal compartment from inadequate maternal amino acid variations.

Relevance:

20.00%

Publisher:

Abstract:

The goal of this thesis is to estimate the effect of the form of knowledge representation on the efficiency of knowledge sharing. The objectives include the design of an experimental framework that would allow this effect to be established, data collection, and statistical analysis of the collected data. The study follows an experimental quantitative design. The experimental questionnaire features three sample forms of knowledge: text, mind maps, and concept maps. In the interview, these forms are presented to an interviewee, after which the knowledge sharing time and knowledge sharing quality are measured. According to the statistical analysis of 76 interviews, text performs worse in both knowledge sharing time and quality compared to the visualized forms of knowledge representation. However, mind maps and concept maps do not differ in knowledge sharing time and quality, since this difference is not statistically significant. Since visualized, structured forms of knowledge perform better than unstructured text in knowledge sharing, companies are advised to foster the usage of these forms in knowledge sharing processes inside the company. Aside from performance in knowledge sharing, the visualized structured forms are preferable due to the possibility of their usage in a system of ontological knowledge management within an enterprise.
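The kind of group comparison described above could be sketched as follows, using hypothetical sharing-time samples for the three forms; the data, and the choice of ANOVA plus a pairwise t-test, are assumptions for illustration rather than the study's actual analysis.

```python
from scipy import stats

# Hypothetical knowledge-sharing times (minutes) for the three representation
# forms; these numbers are illustrative, not the study's data.
text_times = [12.1, 10.4, 11.8, 13.0, 12.5]
mind_map_times = [8.2, 7.9, 9.1, 8.5, 8.8]
concept_map_times = [8.0, 8.6, 7.7, 9.0, 8.3]

# One-way ANOVA across the three groups, then a pairwise t-test between the
# two visualized forms to check whether their difference is significant.
f_stat, p_all = stats.f_oneway(text_times, mind_map_times, concept_map_times)
t_stat, p_maps = stats.ttest_ind(mind_map_times, concept_map_times)

print(f"ANOVA across all forms: p = {p_all:.4f}")
print(f"Mind maps vs concept maps: p = {p_maps:.4f}")
```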

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to explore the clinical efficacy of a novel retrograde puncture approach to establish a preperitoneal space for laparoscopic direct inguinal hernia repair with inguinal ring suturing. Forty-two patients who underwent laparoscopic inguinal hernia repair with retrograde puncture for preperitoneal space establishment as well as inguinal ring suturing between August 2013 and March 2014 at our hospital were enrolled. Preperitoneal space was successfully established in all patients, with a mean establishment time of 6 min. Laparoscopic repairs were successful in all patients, with a mean surgical time of 26±15.1 min. Mean postoperative hospitalization duration was 3.0±0.7 days. Two patients suffered from postoperative local hematomas, which were relieved after puncturing and drainage. Four patients had short-term local pain. There were no cases of chronic pain. Patients were followed up for 6 months to 1 year, and no recurrence was observed. Our results demonstrate that preperitoneal space established by the retrograde puncture technique can be successfully used in adult laparoscopic hernioplasty to avoid intraoperative mesh fixation, and thus reduce medical costs.

Relevance:

20.00%

Publisher:

Abstract:

This guide summarizes useful information about the European Space Agency (ESA), the European space industry, the ECSS standards and product assurance for small and medium enterprises that are aiming to enter the industry. Additionally, the applicability of agile development in space projects is discussed.

Relevance:

20.00%

Publisher:

Abstract:

The aim of the present study was to assess the volatile organic compounds produced by Sporidiobolus salmonicolor (CBS 2636) using methyl and ethyl ricinoleate, ricinoleic acid and castor oil as precursors. The analysis of the volatile organic compounds was carried out using Head Space Solid Phase Micro-Extraction (HS-SPME). A factorial experimental design was used for investigating the extraction conditions, varying stirring rate (0-400 rpm), temperature (25-60 ºC), extraction time (10-30 minutes), and sample volume (2-3 mL). The identification of the volatile organic compounds was carried out by Gas Chromatography with a Mass Spectrum Detector (GC/MSD). The conditions that resulted in maximum extraction were: 60 ºC, 10 minutes of extraction, no stirring, a sample volume of 2.0 mL, and the addition of saturated KCl (1:10 v/v). In the bio-production of volatile organic compounds, the effect of stirring rate (120-200 rpm), temperature (23-33 ºC), pH (4.0-8.0), precursor concentration (0.02-0.1%), mannitol concentration (0-6%), and asparagine concentration (0-0.2%) was investigated. Bio-production at 28 ºC, 160 rpm, pH 6.0 and with the addition of 0.02% ricinoleic acid to the medium yielded the highest production of VOCs, identified as 1,4-butanediol, 1,2,2-trimethylcyclopropylamine, beta-ionone, 2,3-butanedione, pentanal, tetradecane, 2-isononenal, 4-octen-3-one, propanoic acid, and octadecane.
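A factorial screening of the extraction conditions like the one described above can be enumerated with a short sketch. The low/high levels below are taken from the stated ranges, while treating the screening as a two-level full factorial design is an assumption made for illustration.

```python
import itertools

# Low/high levels taken from the ranges given in the abstract; a two-level
# full factorial layout is assumed here purely for illustration.
factors = {
    "stirring_rpm": (0, 400),
    "temperature_C": (25, 60),
    "extraction_min": (10, 30),
    "sample_volume_mL": (2, 3),
}

# Enumerate all 2**4 = 16 runs of the full factorial design.
names = list(factors)
runs = [dict(zip(names, levels))
        for levels in itertools.product(*(factors[n] for n in names))]

for i, run in enumerate(runs, start=1):
    print(i, run)
```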