874 results for Dynamic data analysis
Abstract:
The study builds on the premise that the stronger the level of trustworthiness in a given business relationship, the more likely it is that high-risk activities take place within it. In such cases, trustworthiness becomes a governance mechanism for the events and actions occurring in the relationship, and trust, understood as a willingness to act, appears in the business relationship. The study draws attention to the difference between the concepts of trust and trustworthiness and to the importance of systematically separating them. It presents the application of so-called dyadic data analysis in business studies. Its empirical results also confirm that this method allows a deeper analysis of the social characteristics of business relationships (including trust) and of the relations among them. ____ The paper rests on the behavioral interpretation of trust, making a clear distinction between trustworthiness (honesty) and trust interpreted as willingness to engage in risky situations with specific partners. The hypothesis tested is that in a business relation marked by high levels of trustworthiness as perceived by the opposite parties, willingness to be involved in risky situations is higher than in relations where actors do not believe their partners to be highly trustworthy. Testing this hypothesis clearly calls for dyadic operationalization, measurement, and analysis. The authors present the first economic application of a newly developed statistical technique called dyadic data analysis, which has already been applied in social psychology. It clearly overcomes the problem of single-ended research in business relations analysis and allows a deeper understanding of any dyadic phenomenon, including trust/trustworthiness as a governance mechanism.
Abstract:
With the latest developments in computer science, multivariate data analysis methods have become increasingly popular among economists. Pattern recognition in complex economic data and empirical model construction can be more straightforward with proper application of modern software. However, despite the appealing simplicity of some popular software packages, the interpretation of data analysis results requires strong theoretical knowledge. This book aims to develop both theoretical and application-related data analysis knowledge. The text is designed for advanced-level studies and assumes acquaintance with elementary statistical terms. After a brief introduction to selected mathematical concepts, the highlighting of selected model features is followed by a practice-oriented introduction to the interpretation of SPSS outputs for the described data analysis methods. Learning data analysis is usually time-consuming and demands effort, but with tenacity the learning process can bring about a significant improvement in individual data analysis skills.
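To give a flavour of the multivariate methods such a text covers, here is a minimal principal component analysis sketch in Python/NumPy rather than SPSS (illustrative only; the book works through SPSS output, not code):

```python
import numpy as np

def pca(X, k=1):
    """Project data onto the top-k principal components and report
    the fraction of total variance each component explains."""
    Xc = X - X.mean(axis=0)                 # centre each variable
    cov = np.cov(Xc, rowvar=False)          # covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # eigendecomposition (symmetric)
    order = np.argsort(vals)[::-1]          # sort by descending variance
    vals, vecs = vals[order], vecs[:, order]
    return Xc @ vecs[:, :k], vals / vals.sum()

# Two nearly collinear variables: one component captures almost everything
X = np.array([[1.0, 2.1], [2.0, 3.9], [3.0, 6.0], [4.0, 8.2]])
scores, explained = pca(X)
```

The `explained` vector is the code-level counterpart of the "total variance explained" table an SPSS user would read.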
Abstract:
The exploration and development of oil and gas reserves located in harsh offshore environments are characterized by high risk. Some of these reserves would be uneconomical if produced using conventional drilling technology, due to increased drilling problems and prolonged non-productive time. Seeking new ways to reduce drilling cost and minimize risk has led to the development of Managed Pressure Drilling (MPD) techniques, which address the drawbacks of conventional overbalanced and underbalanced drilling. As MPD techniques evolve, many questions related to safety and operating pressure regimes remain unanswered, and quantitative risk assessment techniques are often used to answer them. Quantitative risk assessment is conducted for the various stages of drilling operations: drilling ahead, tripping, casing, and cementing. A diagnostic model for analyzing the rotating control device, the main component of MPD techniques, is also studied. The Noisy-OR logic concept is explored both to capture the unique relationship between casing and cementing operations in leading to well integrity failure and to model the critical components of the constant bottom-hole pressure variant of MPD during tripping operations. Relevant safety functions and inherent safety principles are utilized to improve well integrity operations. A loss-function modelling approach is adopted to enable dynamic consequence analysis of blowout risk for real-time decision making. The aggregation of the blowout loss categories, comprising production, asset, human health, environmental response, and reputation losses, leads to risk estimation using a dynamically determined probability of occurrence.
Lastly, various sub-models developed for the stages/sub-operations of drilling operations and the consequence modelling approach are integrated for a holistic risk analysis of drilling operations.
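The Noisy-OR gate mentioned above combines independent causes so that the effect occurs unless every active cause fails to trigger it. A minimal sketch (the probabilities below are hypothetical, not the thesis's calibrated values):

```python
def noisy_or(cause_probs, active):
    """P(effect) = 1 - prod(1 - p_i) over the causes that are active.
    Each p_i is the probability that cause i alone produces the effect."""
    p_none = 1.0
    for p, is_active in zip(cause_probs, active):
        if is_active:
            p_none *= (1.0 - p)          # chance this cause fails to trigger
    return 1.0 - p_none

# Hypothetical contributions of a casing defect and a cementing defect
# to well integrity failure
p_fail = noisy_or([0.10, 0.20], [True, True])   # 1 - 0.9 * 0.8 = 0.28
```

The attraction for risk modelling is that each cause needs only one parameter, instead of a full conditional probability table over all cause combinations.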
Abstract:
Acknowledgements The authors acknowledge support from the National Basic Research Program of China (973 Project, No. 2015CB057405), the National Natural Science Foundation of China (No. 11372082), and the State Scholarship Fund of the CSC. DW thanks the University of Aberdeen for its hospitality.
Abstract:
Due to high-speed rotation, rotor mechanical and dynamic problems are more serious in outer-rotor high-speed machines than in conventional ones. In view of these problems, mechanical and dynamic analyses are carried out for an outer-rotor high-speed permanent magnet claw pole motor. An analytical model for rotor stress is derived, and the stress distribution is also calculated by the finite element method, which agrees with the analytical result. In addition, the stress distribution of the outer rotor yoke and the permanent magnets is calculated considering centrifugal force and temperature effects, and the influence of factors such as pole-arc coefficient and speed on the rotor stress distribution is analyzed. The rotor natural frequencies and critical speed are calculated by modal analysis, and the dynamic characteristics influenced by the gyroscopic effect are analyzed using a Campbell diagram. Based on these results, an outer-rotor high-speed permanent magnet claw pole motor is designed and verified.
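The dominant centrifugal loading behind such a rotor stress analysis can be illustrated with the standard thin rotating ring approximation (a textbook simplification, not the paper's full analytical model), in which the hoop stress grows with the square of rotational speed:

```latex
\sigma_{\theta} = \rho\,\omega^{2} r^{2}
```

Here \(\rho\) is the material density, \(\omega\) the angular speed, and \(r\) the mean ring radius. The quadratic dependence on \(\omega\) is why speed is among the most influential factors on rotor stress.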
Abstract:
A substantial amount of information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limit. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content, in which only the informational content is compressed. Thus, the compressed data is made transparent to existing software libraries, which often rely on functional content to work. Secondly, a context-free bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated for a number of standard MapReduce analysis tasks using a collection of real-world datasets.
In comparison with existing solutions, they have shown substantial improvement in performance and significant reduction in system resource requirements.
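The Approximated Huffman Compression scheme builds on the classical Huffman algorithm; a minimal sketch of plain Huffman code construction follows (illustrative only; the thesis's hybrid, pattern-searchable structure is more involved):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free bit-code table: frequent symbols get shorter codes."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    heap = [[n, [sym, ""]] for sym, n in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]          # left branch of merged node
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]          # right branch of merged node
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heapq.heappop(heap)[1:])

codes = huffman_codes("aaabbc")              # 'a' is most frequent, shortest code
```

Because the codes are prefix-free, a compressed stream can be decoded, and in the thesis's extended structure searched, without ambiguity.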
Abstract:
This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. This study outlines the application of compositional data analysis (CoDa) to the calibration of geochemical data and to multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences has required the use of core-scanning X-ray fluorescence (XRF) spectrometry, for both terrestrial and marine sedimentary sequences. Initial XRF data are generally unusable in raw format, requiring data processing to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as matrix effects. Log-ratio based calibration schemes have been developed and applied to clastic sedimentary sequences, focusing mainly on energy dispersive XRF (ED-XRF) core-scanning. This study applied high-resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a subset of core-scan and conventional ED-XRF data to quantify elemental composition, providing a robust calibration scheme using reduced major axis regression of log-ratio transformed geochemical data. Through partial least squares (PLS) modelling of geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
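The log-ratio transforms at the heart of CoDa can be illustrated with the centred log-ratio (clr); a minimal sketch follows (a generic transform, not the paper's LRCE calibration itself):

```python
import math

def clr(parts):
    """Centred log-ratio transform of a strictly positive composition:
    the log of each part relative to the geometric mean of all parts."""
    logs = [math.log(p) for p in parts]
    gmean_log = sum(logs) / len(logs)        # log of the geometric mean
    return [l - gmean_log for l in logs]

# Scale invariance: only ratios between parts matter, so rescaling the
# whole composition (e.g. counts vs. percentages) leaves clr unchanged.
a = clr([1.0, 2.0, 4.0])
b = clr([10.0, 20.0, 40.0])   # same ratios, same clr coordinates
```

This scale invariance is exactly what makes log-ratio methods attractive for XRF count data, where absolute intensities depend on instrument and matrix effects.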
Abstract:
Market research is often conducted through conventional methods such as surveys, focus groups, and interviews, but these can be costly and time-consuming. This study develops a new method, based on a combination of standard techniques such as sentiment analysis and normalisation, to conduct market research in a manner that is free and quick. The method can be used in many application areas, but this study focuses mainly on the veganism market, identifying vegan food preferences in the form of a profile. Several food words are identified, along with their distribution between positive and negative sentiments in the profile. Surprisingly, non-vegan foods such as cheese, cake, milk, pizza, and chicken dominate the profile, indicating that there is a significant market for vegan-suitable alternatives to such foods. Meanwhile, vegan-suitable foods such as coconut, potato, blueberries, kale, and tofu also make strong appearances in the profile. Validation is performed by applying the method to Volkswagen vehicle data to identify positive and negative sentiment across five car models. Some results were found to be consistent with sales figures and expert reviews, while others were inconsistent. The reliability of the method is therefore questionable, and the results should be used with caution.
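The profile-building step can be sketched as follows (the word list, posts, and `food_profile` helper are hypothetical; the study's actual pipeline runs sentiment analysis and normalisation on real social media data):

```python
from collections import Counter

FOOD_WORDS = {"cheese", "pizza", "tofu", "kale", "coconut"}  # illustrative subset

def food_profile(posts):
    """posts: iterable of (text, sentiment) with sentiment 'pos' or 'neg'.
    Returns, per sentiment, each food word's share of that sentiment's mentions."""
    counts = {"pos": Counter(), "neg": Counter()}
    for text, sentiment in posts:
        for token in text.lower().split():
            if token in FOOD_WORDS:
                counts[sentiment][token] += 1
    return {s: {w: n / sum(c.values()) for w, n in c.items()}
            for s, c in counts.items()}

posts = [("Love this cheese pizza", "pos"), ("the tofu was bland", "neg")]
profile = food_profile(posts)   # e.g. profile["pos"]["cheese"] == 0.5
```

Normalising counts to shares is what lets profiles built from differently sized corpora be compared.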
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
The object of this report is to present the data and conclusions drawn from the analysis of the origin and destination information. Comments on the advisability and correctness of the approach used by Iowa are encouraged.
Abstract:
New morpho-bathymetric and tectono-stratigraphic data on the Naples and Salerno Gulfs, derived from bathymetric and seismic data analysis and integrated geologic interpretation, are presented here. The CUBE (Combined Uncertainty Bathymetric Estimator) method has been applied to complex morphologies, such as the Capri continental slope and the related geological structures occurring in the Salerno Gulf. The bathymetric data analysis has been carried out for marine geological maps of the whole Campania continental margin at scales ranging from 1:25.000 to 1:10.000, including focused examples in the Naples and Salerno Gulfs, Naples harbour, the Capri and Ischia Islands, and the Salerno Valley. Seismic data analysis has allowed the correlation of the main morpho-structural lineaments recognized at a regional scale through multichannel profiles with morphological features cropping out at the sea bottom, evident from bathymetry. The main fault systems in the area have been represented on a tectonic sketch map, including the master fault located north of the Salerno Valley half graben. Some normal faults parallel to the master fault have been interpreted from the slope map derived from the bathymetric data. A complex system of antithetic faults bounds two morpho-structural highs located 20 km south of the Capri Island. Seismic interpretation has also shown some hints of compressional reactivation of normal faults in an extensional setting involving the whole Campania continental margin.
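A slope map of the kind used above to pick out fault scarps can be derived from a gridded bathymetric surface by finite differences; a minimal sketch (a generic computation, not the authors' specific processing chain):

```python
import numpy as np

def slope_degrees(depth, cell_size):
    """Slope angle (degrees) of a gridded surface with square cells.
    depth: 2-D array of depths (m); cell_size: grid spacing (m)."""
    dz_dy, dz_dx = np.gradient(depth, cell_size)   # per-axis gradients
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# A plane dipping along x: depth increases 10 m per 10 m cell -> 45 degrees
depth = np.tile(np.arange(5.0) * 10.0, (5, 1))
slope = slope_degrees(depth, 10.0)
```

On real bathymetry, linear belts of anomalously high slope in such a map are candidate fault traces.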
Abstract:
Wind-generated waves in the Kara, Laptev, and East Siberian Seas are investigated using altimeter data from Envisat RA-2 and SARAL-AltiKa. Only isolated ice-free zones were selected for analysis, so wind seas can be treated as pure wind-generated waves without contamination by ambient swell. Such zones were identified using ice concentration data from microwave radiometers. Altimeter data for these areas, both significant wave height (SWH) and wind speed, were obtained for the period 2002-2012 from Envisat RA-2 measurements and for 2013 from SARAL-AltiKa. Dependencies of dimensionless SWH and wavelength on the dimensionless spatial scale of wave generation are compared with known empirical dependencies for fetch-limited wind wave development. We further check the sensitivity of the Ka- and Ku-bands and discuss the new possibilities that AltiKa's higher resolution can open.
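The dimensionless variables used in such fetch-limited comparisons follow the standard wind-speed scaling; a minimal sketch (the numerical inputs are illustrative, not the paper's measurements):

```python
G = 9.81  # gravitational acceleration, m/s^2

def dimensionless_wave_params(swh, u10, fetch):
    """Standard scalings for fetch-limited wave growth:
    H~ = g*Hs/U10^2 and X~ = g*X/U10^2,
    with SWH Hs (m), 10-m wind speed U10 (m/s), and fetch X (m)."""
    return G * swh / u10**2, G * fetch / u10**2

h_nd, x_nd = dimensionless_wave_params(swh=2.0, u10=10.0, fetch=1.0e5)
```

Plotting H~ against X~ collapses observations taken under different wind speeds onto a single growth curve, which is what allows comparison with the empirical fetch-limited dependencies.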
Abstract:
This text is taken from the postgraduate thesis that one of the authors (A.B.) developed for the degree of Medical Physicist at the School of Medical Physics of the University of Florence. The text explores the feasibility of quantitative Magnetic Resonance Spectroscopy as a tool for daily clinical routine use. The results and analysis come from two types of hyperspectral images: the first set comprises hyperspectral images from a standard phantom (reference images); the second, hyperspectral images obtained from a group of patients who underwent MRI examinations at the Santa Maria Nuova Hospital. This interdisciplinary work stems from the IFAC-CNR know-how in data analysis and nanomedicine, and from the clinical expertise of Radiologists and Medical Physicists. The results reported here, which were the subject of the thesis, are original, unpublished, and represent independent work.