912 results for average complexity


Relevance: 20.00%

Abstract:

As a result of the growing interest in studying employee well-being as a complex process that exhibits high levels of within-individual variability and evolves over time, the present study considers the experience of flow in the workplace from a nonlinear dynamical systems approach. Our goal is to offer new ways to move the study of employee well-being beyond linear approaches. With nonlinear dynamical systems theory as the backdrop, we conducted a longitudinal study using the experience sampling method and qualitative semi-structured interviews for data collection; 6,981 data records were collected from a sample of 60 employees. The obtained time series were analyzed using several techniques derived from nonlinear dynamical systems theory (i.e., recurrence analysis and surrogate data) together with multiple correspondence analyses. The results revealed the following: (1) flow in the workplace presents a high degree of within-individual variability, and this variability is chaotic in most cases (75%); (2) high levels of flow are associated with chaos; and (3) different dimensions of the flow experience (e.g., merging of action and awareness) as well as individual (e.g., age) and job characteristics (e.g., job tenure) are associated with the emergence of different dynamic patterns (chaotic, linear and random).
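
The recurrence analysis mentioned above can be illustrated with its simplest statistic, the recurrence rate: the fraction of pairs of observations that fall within a tolerance eps of each other. The sketch below is our own simplification on the raw, unembedded series (the study's full analysis also involves embedding and surrogate-data testing):

```python
import numpy as np

def recurrence_rate(x, eps):
    # Fraction of ordered pairs (i, j), i != j, whose values lie within
    # eps of each other -- the simplest recurrence-analysis statistic.
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])          # pairwise distance matrix
    n = len(x)
    return ((d <= eps).sum() - n) / (n * (n - 1))  # drop diagonal self-pairs
```

A series that revisits the same values often scores high; a purely random series with wide spread scores low relative to eps.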

Relevance: 20.00%

Abstract:

A novel unsymmetric dinucleating ligand (LN3N4) combining tridentate and tetradentate binding sites linked through a m-xylyl spacer was synthesized as a ligand scaffold for preparing homo- and heterodimetallic complexes, in which the two metal ions are bound in two different coordination environments. Site-selective binding of different metal ions is demonstrated: LN3N4 is able to discriminate between CuI and a complementary metal (M′ = CuI, ZnII, FeII, CuII, or GaIII), so that pure heterodimetallic complexes with the general formula [CuIM′(LN3N4)]n+ are synthesized. Reaction of the dicopper(I) complex [CuI2(LN3N4)]2+ with O2 leads to the formation of two different copper-dioxygen (Cu2O2) intermolecular species (O and TP) between two copper atoms located in the same site of different complex molecules. Taking advantage of this feature, the reaction of the heterodimetallic complexes [CuM′(LN3N4)]n+ with O2 at low temperature is used as a tool to determine the final position of the CuI center in the system, because only one of the two Cu2O2 species is formed.

Relevance: 20.00%

Abstract:

Integrating into a single activity an organic livestock operation based on dairy goats and an artisanal cheese dairy producing native varieties of goat cheese makes it possible to occupy a market niche still largely unexploited in Catalonia. Likewise, the difficulty that many of the country's artisanal cheesemakers currently face in finding goat milk in sufficient quantities and with minimum hygienic-sanitary guarantees hinders the progress of a sector with very good future prospects, artisanal cheesemaking, and justifies the need to create new farms (or convert existing ones) oriented fully towards the dairy use of goat herds, which is currently very limited in Catalonia. Moreover, current requirements in animal welfare, as well as in food quality and safety, demand that livestock farms and food industries comply with a series of requirements, both in the design of the premises and facilities and in the control of activities and manufactured products, that guarantee at all times their correct management and operation in these matters. The aim of this project, then, is the legalization of an activity, which we shall call "Mas Peirot, S.L.", devoted to the organic farming of dairy goats and to the artisanal production of goat cheese. It is worth mentioning that, although this is a project for the legalization of an activity, aspects and details related to its execution have also been considered, since the works involved are of some complexity and this allows the planned actions to be treated in greater detail and rigour.
Specifically, this project plans the refurbishment of two livestock buildings currently in disuse so that they can house the goat farm and the artisanal cheese dairy, also designing all the facilities required for the mentioned activities to be carried out in those buildings. The main conclusions of this project are that, with a herd of 100 goats, it is possible to obtain an average milk production of 32,500 litres per year, corresponding to a total of 3,823 kilograms of cheese per year (of the Garrotxa cheese and goat cheese in oil types, both traditional Catalan varieties), yielding a profit of 21,389.47 euros/year during the first 15 years and 33,386.39 euros/year from the 16th year onwards. This will allow the initial investment of 172,716.76 euros in the projected buildings and facilities to be recovered within 8 years and, as noted, will make it possible to occupy a market niche still little exploited by the agri-food industry in Catalonia.

Relevance: 20.00%

Abstract:

In wireless communications, transmitted signals may be corrupted by noise. The receiver must decode the received message, which can be mathematically modelled as a search for the lattice point closest to a given vector. This problem is NP-hard in general, but for communications applications there exist algorithms that, for a certain range of system parameters, offer polynomial expected complexity. The purpose of the thesis is to study the sphere decoding algorithm introduced in the article "On Maximum-Likelihood Detection and the Search for the Closest Lattice Point", published by M.O. Damen, H. El Gamal and G. Caire in 2003. We concentrate especially on its computational complexity when used in space–time coding. Computer simulations are used to study how different system parameters affect the computational complexity of the algorithm. The aim is to find ways to improve the algorithm from the complexity point of view. The main contribution of the thesis is the construction of two new modifications of the sphere decoding algorithm, which are shown to perform faster than the original algorithm within a range of system parameters.
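
As an illustration of the closest-lattice-point search the thesis studies, the following is a minimal depth-first sphere decoder in the Fincke-Pohst spirit. It is a generic textbook-style sketch, not the specific algorithm of Damen, El Gamal and Caire: it assumes the lattice generator has already been reduced to an upper-triangular matrix R (e.g. by QR decomposition) and prunes any branch whose accumulated distance exceeds the best candidate found so far:

```python
import math

import numpy as np

def sphere_decode(R, y, radius):
    # Depth-first search for the integer vector x minimising ||y - R x||,
    # visiting only lattice points inside the initial sphere of given radius.
    # R must be upper triangular with a nonzero diagonal; y has length n.
    n = R.shape[0]
    best = {"x": None, "d2": radius ** 2}
    x = np.zeros(n, dtype=int)

    def search(level, dist2):
        # residual at this level, given the coordinates already fixed below
        r = y[level] - R[level, level + 1:] @ x[level + 1:]
        c = r / R[level, level]
        # integer candidates that still fit in the remaining distance budget
        span = math.sqrt(max(best["d2"] - dist2, 0.0)) / abs(R[level, level])
        for xi in range(math.ceil(c - span), math.floor(c + span) + 1):
            d2 = dist2 + (r - R[level, level] * xi) ** 2
            if d2 > best["d2"]:
                continue            # prune: outside the current sphere
            x[level] = xi
            if level == 0:
                best["x"], best["d2"] = x.copy(), d2   # sphere shrinks here
            else:
                search(level - 1, d2)

    search(n - 1, 0.0)
    return best["x"]                # None if no lattice point within radius
```

Each time a full candidate is found, the search radius shrinks to that candidate's distance, which is what gives the algorithm its low expected complexity in the favourable parameter range.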

Relevance: 20.00%

Abstract:

Asphaltenes are blamed for various problems in the petroleum industry, especially the formation of solid deposits and the stabilization of water-in-oil emulsions. Many studies have been conducted to characterize the chemical structures of asphaltenes and assess their phase behavior in crude oil or in model systems of asphaltenes extracted from oil or from asphaltic refinery residues. However, due to the diversity and complexity of these structures, there is still much to be investigated. In this study, asphaltene (sub)fractions were extracted from an asphaltic residue (AR02), characterized by NMR, elemental analysis, X-ray fluorescence and MS-TOF, and compared to asphaltene subfractions obtained from another asphaltic residue (AR01) described in a previous article. The (sub)fractions obtained from the two residues were used to prepare model systems containing 1 wt% of asphaltenes in toluene, and their phase behavior was evaluated by measuring the asphaltene precipitation onset using optical microscopy. The results indicated minor differences between the asphaltene fractions obtained from the asphaltic residues of distinct origins with respect to aromaticity, elemental composition (CHN), presence and content of heteroelements, and average molar mass. Regarding stability, minor differences in molecular polarity appear to promote major differences in the phase behavior of each of the asphaltene fractions isolated.

Relevance: 20.00%

Abstract:

A company’s competence in managing its product portfolio complexity is becoming critically important in the rapidly changing business environment. The continuous evolution of customer needs, the competitive market environment and internal product development lead to increasing complexity in product portfolios. Companies that manage this complexity in product development are more profitable in the long run. The complexity derives from product development and management processes in which new product variant development is not managed efficiently. Complexity is managed through modularization, a method that divides the product structure into modules. In modularization, it is essential to take into account the trade-off between perceived customer value and module or component commonality across the products. Another goal is to make product configuration more flexible. The benefits are achieved by optimizing complexity in the module offering and by deriving new product variants more flexibly and accurately. The developed modularization process includes steps for preparation, mapping the current situation, creating a modular strategy and implementing that strategy. The organization and support systems also have to be adapted to follow up on targets and to execute modularization in practice.

Relevance: 20.00%

Abstract:

The use of intensity-modulated radiotherapy (IMRT) has increased substantially in modern radiotherapy (RT) over the past two decades. With IMRT, radiation dose distributions can be delivered with higher conformality than with conventional 3D-conformal radiotherapy (3D-CRT). Higher conformality and target coverage increase the probability of tumour control and decrease normal tissue complications. The primary goal of this work is to improve and evaluate the accuracy, efficiency and delivery techniques of RT treatments using IMRT. This study evaluated the dosimetric limitations and possibilities of IMRT in small volumes (treatments of head-and-neck, prostate and lung cancer) and large volumes (primitive neuroectodermal tumours). The dose coverage of target volumes and the sparing of critical organs were improved with IMRT when compared to 3D-CRT. The developed split-field IMRT technique was found to be a safe and accurate method for craniospinal irradiation. By using IMRT for simultaneous integrated boosting of biologically defined target volumes of localized prostate cancer, high doses were achievable with only a small increase in treatment complexity. Biological plan optimization increased the probability of uncomplicated control on average by 28% when compared to standard IMRT delivery. Unfortunately, IMRT also carries some drawbacks. In IMRT the beam modulation is realized by splitting a large radiation field into small apertures. The smaller the beam apertures, the larger the rebuild-up and rebuild-down effects at tissue interfaces. The limitations of using IMRT with small apertures in the treatment of small lung tumours were investigated with dosimetric film measurements. The results confirmed that the peripheral doses of small lung tumours decreased as the effective field size was decreased. The studied calculation algorithms were not able to model this dose deficiency accurately.
The use of small sliding-window apertures of 2 mm and 4 mm decreased the tumour peripheral dose by 6% when compared to a 3D-CRT treatment plan. A direct aperture-based optimization (DABO) technique was examined as a solution for decreasing treatment complexity. The DABO IMRT technique was able to achieve treatment plans equivalent to those of conventional fluence-based IMRT optimization in concave head-and-neck target volumes. With DABO, the effective field sizes were increased and the number of MUs was reduced by a factor of two. The optimality of a treatment plan and the therapeutic ratio can be further enhanced by using dose painting based on regional radiosensitivities imaged with functional imaging methods.
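
The probability of uncomplicated control mentioned above is, in the radiotherapy optimization literature, commonly taken as the chance of achieving tumour control without normal-tissue complications. Assuming the two endpoints are statistically independent, a standard formulation (the thesis may use a more refined variant) is

```latex
P_{+} = P_{B}\,\bigl(1 - P_{I}\bigr)
```

where P_B is the tumour control probability (TCP) and P_I is the normal tissue complication probability (NTCP), so biological plan optimization seeks the dose distribution that maximizes P_+.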

Relevance: 20.00%

Abstract:

The thesis examines the profitability of DMAC trading rules in the Finnish stock market over the 1996-2012 period. It contributes to the existing technical analysis literature by comparing for the first time the performance of DMAC strategies based on individual stock trading portfolios to the performance of index trading strategies based on trading the index (OMX Helsinki 25) that consists of the same stocks. In addition, market frictions, including transaction costs and taxes, are taken into account, and the results are reported from both the institutional and the individual investor’s perspective. Performance characteristics of DMAC rules are evaluated by simulating 19,900 different trading strategies in total for two non-overlapping 8-year sub-periods, and by decomposing the full-sample-period performance of DMAC trading strategies into distinct bullish- and bearish-period performances. The results show that the best DMAC rules have predictive power on future price trends, and these rules are able to outperform the buy-and-hold strategy. Although the performance of the DMAC strategies is highly dependent on the combination of moving average lengths, the best DMAC rules of the first sub-period also performed well during the latter sub-period in the case of individual stock trading strategies. According to the results, the outperformance of DMAC trading rules over the buy-and-hold strategy is mostly attributable to their superiority during bearish periods, and particularly during stock market crashes.
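
A DMAC (dual moving-average crossover) rule of the kind simulated in the thesis can be sketched as follows: hold the asset while the short moving average is above the long one and stay out of the market otherwise. The function name and the long-only framing are our own illustration; the thesis evaluates 19,900 (short, long) length combinations and also accounts for transaction costs and taxes:

```python
import numpy as np

def dmac_signals(prices, short_win, long_win):
    # Dual moving-average crossover: 1 = hold the asset (short MA above
    # long MA), 0 = stay in cash.  Signals are aligned to the last
    # len(prices) - long_win + 1 observations, where both MAs exist.
    prices = np.asarray(prices, dtype=float)
    sma = lambda w: np.convolve(prices, np.ones(w) / w, mode="valid")
    n = len(prices) - long_win + 1
    return np.where(sma(short_win)[-n:] > sma(long_win)[-n:], 1, 0)
```

On a rising-then-falling price series the rule is long during the uptrend and exits shortly after the trend reverses, which is why its edge over buy-and-hold concentrates in bearish periods.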

Relevance: 20.00%

Abstract:

This thesis describes an approach to overcoming the complexity of software product management (SPM) and consists of several studies that investigate the activities and roles in product management, as well as issues related to the adoption of SPM. The thesis focuses on organizations that have started adopting SPM but have faced difficulties due to its complexity and fuzziness, and it suggests frameworks for overcoming these challenges using the principles of decomposition and iterative improvement. The research process consisted of three phases, each of which provided complementary results and empirical observations on the problem of overcoming the complexity of SPM. Overall, product management processes and practices in 13 companies were studied and analysed. Moreover, additional data was collected with a survey conducted worldwide. The collected data were analysed using grounded theory (GT) to identify possible ways to overcome the complexity of SPM. Complementary research methods, such as elements of the Theory of Constraints, were used for deeper data analysis. The results of the thesis indicate that decomposing SPM activities according to the specific characteristics of companies and roles is a useful approach for simplifying existing SPM frameworks. Companies would benefit from the results by adopting SPM activities more efficiently and effectively, and by spending fewer resources on adoption through concentrating on the most important SPM activities.

Relevance: 20.00%

Abstract:

Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high-performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges, such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high-performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption. Thus, we propose three different approaches for alleviating such congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing this information over the network, and utilizing it when making a routing decision. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique to make better routing decisions when traffic information for different routes is available. Faults affect performance significantly, as packets must then take longer paths in order to be routed around the faults, which in turn increases congestion around the faulty regions.
We propose four methods to tolerate faults at the link and switch level by using only the shortest paths, as long as such a path exists. The distinguishing characteristic of these methods is that they tolerate faults while also maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first approaches to bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic. This is because the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with their analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
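
The congestion-aware idea behind the first approach can be caricatured in a few lines: a minimal-path router on a 2D mesh that, among the productive output directions, picks the neighbour currently reporting the lowest congestion. This is a deliberately simplified sketch of the general principle (the function name and the congestion dictionary are our own), not any of the thesis's actual algorithms:

```python
def route_step(cur, dst, congestion):
    # One hop of minimal adaptive routing on a 2D mesh: consider only
    # productive directions (those reducing the distance to dst) and pick
    # the candidate next hop with the lowest reported congestion.
    # Assumes cur != dst; unknown nodes default to congestion 0.
    x, y = cur
    dx, dy = dst[0] - x, dst[1] - y
    candidates = []
    if dx:
        candidates.append((x + (1 if dx > 0 else -1), y))
    if dy:
        candidates.append((x, y + (1 if dy > 0 else -1)))
    return min(candidates, key=lambda nxt: congestion.get(nxt, 0))
```

Because only minimal (shortest-path) hops are ever considered, the adaptivity comes solely from choosing between the at most two productive directions, which keeps the scheme deadlock-analysis-friendly.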

Relevance: 20.00%

Abstract:

A complex system is any system that exhibits intricate behavior and is hard to model using the reductionist approach of successive subdivision in search of ''elementary'' constituents. Nature provides us with plenty of examples of such systems, in fields as diverse as biology, chemistry, geology, physics, fluid mechanics, and engineering. What happens, in general, is that for these systems a large number of both attracting and unstable chaotic sets coexist. As a result, we can have rich and varied dynamical behavior, in which many competing behaviors coexist. In this work, we present and discuss simple mechanical systems that are nice paradigms of complex systems when they are subjected to random external noise. We argue that systems with few degrees of freedom can present the same complex behavior under quite general conditions.
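
A minimal example of the kind of system the paper has in mind is a damped particle in a double-well potential kicked by random noise: with only one mechanical degree of freedom it already hops irregularly between two competing attractors. The sketch below (all parameter values are illustrative, not taken from the paper) integrates such a system with the Euler-Maruyama method:

```python
import math
import random

def duffing_noisy(steps=10000, dt=0.005, gamma=0.15, noise=0.3, seed=1):
    # Euler-Maruyama integration of a damped double-well oscillator driven
    # by Gaussian noise:  x'' = -gamma*x' + x - x**3 + noise * xi(t).
    # The wells at x = +1 and x = -1 act as two competing attractors.
    rng = random.Random(seed)
    x, v = 1.0, 0.0                      # start in the right-hand well
    traj = []
    for _ in range(steps):
        force = x - x ** 3 - gamma * v   # deterministic part of x''
        v += force * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x += v * dt
        traj.append(x)
    return traj
```

For weak noise the trajectory lingers near one well; increasing the noise amplitude produces irregular inter-well transitions, the simple paradigm of complex behavior discussed above.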

Relevance: 20.00%

Abstract:

Physical exercise is associated with parasympathetic withdrawal and increased sympathetic activity resulting in heart rate increase. The rate of post-exercise cardiodeceleration is used as an index of cardiac vagal reactivation. Analysis of heart rate variability (HRV) and complexity can provide useful information about autonomic control of the cardiovascular system. The aim of the present study was to ascertain the association between heart rate decrease after exercise and HRV parameters. Heart rate was monitored in 17 healthy male subjects (mean age: 20 years) during the pre-exercise phase (25 min supine, 5 min standing), during exercise (8 min of the step test with an ascending frequency corresponding to 70% of individual maximal power output) and during the recovery phase (30 min supine). HRV analysis in the time and frequency domains and evaluation of a newly developed complexity measure - sample entropy - were performed on selected segments of heart rate time series. During recovery, heart rate decreased gradually but did not attain pre-exercise values within 30 min after exercise. On the other hand, HRV gradually increased, but did not regain rest values during the study period. Heart rate complexity was slightly reduced after exercise and attained rest values after 30-min recovery. The rate of cardiodeceleration did not correlate with pre-exercise HRV parameters, but positively correlated with HRV measures and sample entropy obtained from the early phases of recovery. In conclusion, the cardiodeceleration rate is independent of HRV measures during the rest period but it is related to early post-exercise recovery HRV measures, confirming a parasympathetic contribution to this phase.
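
The sample entropy measure used in the recovery analysis can be sketched as follows: it is the negative logarithm of the conditional probability that two subsequences matching for m points (within a tolerance r, usually a fraction of the series' standard deviation) also match for m+1 points. This is a compact textbook-style variant, not necessarily the exact implementation used in the study:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    # Sample entropy: -log(A/B), where B counts pairs of m-point templates
    # within tolerance r (Chebyshev distance) and A counts pairs of
    # (m+1)-point templates within r.  Self-matches are excluded.
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def match_pairs(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
        return (d <= r).sum() - len(t)   # drop the diagonal self-matches

    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Lower values indicate a more regular, predictable series; the slight post-exercise reduction in heart rate complexity reported above corresponds to a drop in this quantity.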