77 results for Measurement-based quantum computing


Relevance:

30.00%

Abstract:

In this study, cantilever-enhanced photoacoustic spectroscopy (CEPAS) was applied to different drug detection schemes. The study was divided into two applications: trace detection of vaporized drugs and drug precursors in the gas phase, and detection of cocaine abuse in hair. The main focus, however, was the study of hair samples. In the gas phase, methyl benzoate, a hydrolysis product of cocaine hydrochloride, and benzyl methyl ketone (BMK), a precursor of amphetamine and methamphetamine, were investigated. In the solid phase, hair samples from cocaine overdose patients were measured and compared to a drug-free reference group. As hair consists mostly of long fibrous proteins generally called keratin, proteins from fingernails and saliva were also studied for comparison. Different measurement setups were applied in this study. Gas measurements were carried out using a quantum cascade laser (QCL) as the source in the photoacoustic detection. An external cavity (EC) design was also used for a broader tuning range. Detection limits of 3.4 parts per billion (ppb) for methyl benzoate and 26 ppb for BMK in 0.9 s were achieved with the EC-QCL PAS setup. The achieved detection limits are sufficient for realistic drug detection applications. The measurements from drug overdose patients were carried out using Fourier transform infrared (FTIR) PAS. The drug-containing hair samples and drug-free samples were both measured with the FTIR-PAS setup, and the measured spectra were analyzed statistically with principal component analysis (PCA). The two groups were separated by their spectra with PCA and proper spectral pre-processing. To improve the method, EC-QCL measurements of the hair samples and studies using photoacoustic microsampling techniques were performed. High-quality, high-resolution spectra with a broad tuning range were recorded from a single hair fiber. This broad tuning range of an EC-QCL has not previously been used in the photoacoustic spectroscopy of solids. However, no drug detection studies were performed with the EC-QCL solid-phase setup.
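As an illustration of the statistical analysis step described above, the following is a minimal sketch of separating two groups of spectra with PCA using scikit-learn. The data arrays are hypothetical placeholders, not the thesis's FTIR-PAS measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical spectra: 20 "drug-positive" and 20 "drug-free" hair
# samples, each a 500-point absorption spectrum; a small offset between
# the groups stands in for the real spectral differences.
positive = rng.normal(1.00, 0.05, (20, 500))
negative = rng.normal(0.95, 0.05, (20, 500))
spectra = np.vstack([positive, negative])

# Spectral pre-processing (here plain standardization; the abstract
# notes that proper pre-processing is essential for the separation).
scaled = StandardScaler().fit_transform(spectra)

# Project onto the first two principal components; with real data the
# two groups would be inspected for separation in this score space.
scores = PCA(n_components=2).fit_transform(scaled)
print("positive group mean scores:", scores[:20].mean(axis=0))
print("negative group mean scores:", scores[20:].mean(axis=0))
```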

Relevance:

30.00%

Abstract:

Video transcoding refers to the process of converting a digital video from one format into another format. It is a compute-intensive operation. Therefore, transcoding of a large number of simultaneous video streams requires a large amount of computing resources. Moreover, to handle different load conditions in a cost-efficient manner, the video transcoding service should be dynamically scalable. Infrastructure as a Service (IaaS) clouds currently offer computing resources, such as virtual machines, under the pay-per-use business model. Thus the IaaS clouds can be leveraged to provide a cost-efficient, dynamically scalable video transcoding service. To use computing resources efficiently in a cloud computing environment, cost-efficient virtual machine provisioning is required to avoid over-utilization and under-utilization of virtual machines. This thesis presents proactive virtual machine resource allocation and de-allocation algorithms for video transcoding in cloud computing. Since users' requests for videos may change at different times, a check is required to see if the current computing resources are adequate for the video requests. Therefore, work on admission control is also provided. In addition to admission control, temporal resolution reduction is used to avoid jitter in a video. Furthermore, in a cloud computing environment such as Amazon EC2, the computing resources are more expensive than the storage resources. Therefore, to avoid repetition of transcoding operations, a transcoded video needs to be stored for a certain time. Storing all videos for the same amount of time is also not cost-efficient, because popular transcoded videos have a high access rate while unpopular transcoded videos are rarely accessed. This thesis provides a cost-efficient computation and storage trade-off strategy, which stores videos in the video repository as long as it is cost-efficient to store them. This thesis also proposes video segmentation strategies for bit rate reduction and spatial resolution reduction video transcoding. The evaluation of the proposed strategies is performed using a message passing interface based video transcoder, which uses a coarse-grain parallel processing approach where video is segmented at the group of pictures level.
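One plausible reading of the computation-storage trade-off described above is a per-video rule: keep a transcoded video stored while the expected cost of re-transcoding it on demand exceeds the cost of storing it. The sketch below illustrates that rule; all prices, rates, and function names are hypothetical, not the thesis's actual strategy.

```python
# Sketch of a computation-vs-storage trade-off: keep a transcoded video
# stored while re-transcoding it for each expected access would cost
# more than another month of storage. Numbers are hypothetical.

def keep_in_storage(access_rate_per_month: float,
                    transcode_cost: float,
                    storage_cost_per_month: float) -> bool:
    """Return True if storing the video for another month is cheaper
    than deleting it and re-transcoding it for each expected access."""
    expected_recompute_cost = access_rate_per_month * transcode_cost
    return expected_recompute_cost > storage_cost_per_month

# A popular video is worth keeping; a rarely accessed one is not.
print(keep_in_storage(access_rate_per_month=50, transcode_cost=0.10,
                      storage_cost_per_month=0.02))   # True
print(keep_in_storage(access_rate_per_month=0.1, transcode_cost=0.10,
                      storage_cost_per_month=0.02))   # False
```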

Relevance:

30.00%

Abstract:

The superconducting gap is a basic characteristic of a superconductor. While the cuprates and conventional phonon-mediated superconductors are characterized by distinct d- and s-wave pairing symmetries with nodal and nodeless gap distributions, respectively, the superconducting gap distributions in iron-based superconductors are rather diversified. While nodeless gap distributions have been directly observed in Ba1–xKxFe2As2, BaFe2–xCoxAs2, LiFeAs, KxFe2–ySe2, and FeTe1–xSex, signatures of a nodal superconducting gap have been reported in LaOFeP, LiFeP, FeSe, KFe2As2, BaFe2–xRuxAs2, and BaFe2(As1–xPx)2. Due to the multiplicity of the Fermi surface in these compounds, s± and d pairing states can be both nodeless and nodal. A nontrivial orbital structure of the order parameter, in particular the presence of gap nodes, leads to disorder effects that are much richer in dx2–y2-wave superconductors than in conventional materials. In contrast to the s-wave case, the Anderson theorem does not hold, and nonmagnetic impurities exhibit a strong pair-breaking influence. In addition, a finite concentration of disorder produces a nonzero density of quasiparticle states at zero energy, which results in a considerable modification of the thermodynamic and transport properties at low temperatures. The influence of order parameter symmetry on the vortex core structure in iron-based pnictide and chalcogenide superconductors has been investigated in the framework of the quasiclassical Eilenberger equations. The main results of the thesis are as follows. The vortex core characteristics, such as the cutoff parameter ξh and the core size ξ2, defined as the distance at which the vortex supercurrent density reaches its maximum, are calculated over wide ranges of temperature, impurity scattering rate, and magnetic field. The cutoff parameter ξh(B, T, Γ) determines the form factor of the flux-line lattice, which can be obtained in μSR, NMR, and SANS experiments. A comparison among the applied pairing symmetries is made. In contrast to s-wave systems, in dx2–y2-wave superconductors ξh/ξc2 always increases with the scattering rate Γ. The field dependence of the cutoff parameter strongly affects the second moment of the magnetic field distribution, resulting in a significant difference from the nonlocal London theory. It is found that the normalized ξ2/ξc2(B/Bc2) dependence increases with pair-breaking impurity scattering (interband scattering for s±-wave and intraband impurity scattering for d-wave superconductors). Here, ξc2 is the Ginzburg-Landau coherence length determined from the upper critical field Bc2 = Φ0/(2πξc2²), where Φ0 is the flux quantum. Two types of ξ2/ξc2 magnetic field dependence are obtained for s± superconductors: the dependence has a minimum at low temperatures and weak impurity scattering, transforming into a monotonically decreasing function at strong scattering and high temperatures. The second kind of dependence has also been found for d-wave superconductors at intermediate and high temperatures. In contrast, impurity scattering results in a decreasing ξ2/ξc2(B/Bc2) dependence in s++ superconductors. A reasonable agreement between the calculated ξh/ξc2 values and those obtained experimentally in nonstoichiometric BaFe2–xCoxAs2 (μSR) and stoichiometric LiFeAs (SANS) was found. The values of ξh/ξc2 are much less than one for the first compound and much greater than one for the second. This is explained by the different influence of two factors: the value of the impurity scattering rate and the pairing symmetry.
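For reference, the standard relation used above between the upper critical field and the Ginzburg-Landau coherence length, written out explicitly:

```latex
% Upper critical field and Ginzburg-Landau coherence length
% (standard relation, as used in the abstract above).
\[
  B_{c2} = \frac{\Phi_0}{2\pi\,\xi_{c2}^{2}}
  \quad\Longleftrightarrow\quad
  \xi_{c2} = \sqrt{\frac{\Phi_0}{2\pi B_{c2}}},
\]
% where \Phi_0 = h/2e is the magnetic flux quantum.
```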

Relevance:

30.00%

Abstract:

Cloud computing is a practically relevant paradigm in computing today, and testing is one of the distinct areas where it can be applied. This study addressed the applicability of cloud computing for testing within organizational and strategic contexts, focusing on issues related to the adoption, use and effects of cloud-based testing. The study applied empirical research methods. The data was collected through interviews with practitioners from 30 organizations and was analysed using the grounded theory method. The research process consisted of four phases. The first phase studied the definitions and perceptions related to cloud-based testing. The second phase observed cloud-based testing in real-life practice. The third phase analysed quality in the context of cloud application development. The fourth phase studied the applicability of cloud computing in the gaming industry. The results showed that cloud computing is relevant and applicable for testing and application development, as well as other areas, e.g., game development. The research identified the benefits, challenges, requirements and effects of cloud-based testing, and formulated a roadmap and strategy for adopting cloud-based testing. The study also explored quality issues in cloud application development. As a special case, the research included a study on the applicability of cloud computing in game development. The results can be used by companies to enhance their processes for managing cloud-based testing, evaluating practical cloud-based testing work and assessing the appropriateness of cloud-based testing for specific testing needs.

Relevance:

30.00%

Abstract:

Smartphones have become part and parcel of our lives, with mobility providing the freedom of not being bound by time and space. In addition, the number of smartphones produced each year is skyrocketing. However, this has also created discrepancies, or fragmentation, among devices and operating systems, which in turn has made it exceedingly hard for developers to deliver hundreds of similarly featured applications in various versions for market consumption. This thesis is an attempt to investigate whether cloud-based mobile development platforms can mitigate and eventually eliminate fragmentation challenges. During this research, we selected and analyzed the most popular cloud-based development platforms and tested their integrated cloud features. This research showed that cloud-based mobile development platforms may be able to reduce mobile fragmentation and make it possible to use a single codebase to deliver a mobile application for different platforms.

Relevance:

30.00%

Abstract:

In this Master's thesis we discuss issues related to the measurement of the effective scattering surface (radar cross section) based on the Doppler effect. The detected signal was modeled, and narrowband signal filtering using a low-frequency amplifier was examined. The parameters of the proposed horn antennas were studied, and radar cross section charts for three different objects were obtained.
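As background to the measurement principle, the Doppler shift produced by a moving target in a monostatic radar follows the standard relation below (general radar theory, not a result quoted from the thesis).

```latex
% Monostatic-radar Doppler relation underlying the measurement
% principle (standard background, not a result from the thesis).
\[
  f_d = \frac{2 v_r}{\lambda} = \frac{2 v_r f_0}{c},
\]
% where v_r is the target's radial velocity, f_0 the transmitted
% frequency, \lambda the wavelength, and c the speed of light.
```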

Relevance:

30.00%

Abstract:

Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue also at ground level, can cause transient faults. These can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform while maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.

Relevance:

30.00%

Abstract:

The structure and optical properties of thin films based on C60 materials are studied. A reproducible vacuum method for producing thin fullerene films with Cd impurity on Si, glass, and mica surfaces is developed. The surface morphology of the films is investigated by AFM and SEM methods. Ab initio quantum-chemical calculations of the geometry, total energy and excited energy states of complex fullerene-cadmium telluride supramolecules are performed. Photoluminescence spectra of composite thin films based on C60 were measured before and after X-ray irradiation. The intensity of the additional peaks is determined by the charge composition, which depends on the type of substrate. These results are interpreted as the appearance of dipole-allowed transitions in the spectrum of fullerene excited singlet states caused by interaction with cadmium telluride. The X-ray irradiated films were investigated, and additional peaks in the photoluminescence spectra were detected. These peaks appear as a result of the formation of molecular complexes in the C60-CdTe mixture and dimerization of the films. Density functional B3LYP quantum-chemical calculations for C60-CdTe molecular complexes and for (C60)2 and C120O dimers were performed to elucidate some of the experimental results.

Relevance:

30.00%

Abstract:

The goal of this research is to critically analyze current theories and methods of intangible asset valuation and to develop and test a new methodology based on practical example(s) from the IT industry. With this goal in mind, the main research questions are: What are the advantages and disadvantages of current practices for measuring intellectual capital or valuing intangible assets? How can intellectual capital in IT be properly measured? The resulting method exhibits a new approach to IC measurement and potentially an even larger field of application. Although this research focuses on IT (the software and Internet services cluster, to be exact), the logic behind the method is applicable to any industry, since the method is designed to be fully compliant with measurement theory and can thus be properly scaled for any application. Building a new method is a difficult and iterative process: in its current iteration the method is more a theoretical concept than a business tool; even so, the current concept fulfills its purpose as a benchmarking tool for measuring intellectual capital in the IT industry.

Relevance:

30.00%

Abstract:

Fluid handling systems such as pump and fan systems are found to have significant potential for energy efficiency improvements. To realize this energy-saving potential, easily implementable methods are needed for monitoring the system output, because information is needed to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
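One classical ingredient of such model-based estimation is combining a published pump curve with the affinity laws, so that the drive's own speed and shaft power estimates yield an operating point without extra sensors. The sketch below illustrates that idea on hypothetical curve coefficients; it is not the combined method proposed in the paper.

```python
# Minimal sketch of one well-known model-based operating point
# estimation idea for pumps: scale the published shaft power curve
# from the nominal speed to the present speed with the affinity laws
# (Q ~ n, P ~ n^3), then read the flow estimate off the scaled curve
# using the drive's shaft power estimate. Curve coefficients are
# hypothetical placeholders.
import numpy as np

N_NOMINAL = 1450.0  # nominal speed (rpm) of the hypothetical pump

def power_curve_nominal(q: np.ndarray) -> np.ndarray:
    """Hypothetical shaft power curve P(Q) at nominal speed (kW vs l/s)."""
    return 2.0 + 0.8 * q - 0.01 * q**2

def estimate_flow(shaft_power_kw: float, speed_rpm: float) -> float:
    """Estimate flow (l/s) from shaft power and rotational speed."""
    q_grid = np.linspace(0.0, 40.0, 2001)       # candidate flows at present speed
    q_nominal = q_grid * N_NOMINAL / speed_rpm  # affinity law: Q scales with n
    p_scaled = power_curve_nominal(q_nominal) * (speed_rpm / N_NOMINAL) ** 3
    return float(q_grid[np.argmin(np.abs(p_scaled - shaft_power_kw))])

# Example: drive reports 8.0 kW shaft power at 1200 rpm.
print(estimate_flow(shaft_power_kw=8.0, speed_rpm=1200.0))
```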

Relevance:

30.00%

Abstract:

Smart home implementation in residential buildings promises to optimize energy usage and save a significant amount of energy simply through a better understanding of the user's energy usage profile. Apart from its energy optimization prospects, this technology also aims to offer occupants a significant amount of comfort and remote control over home appliances, both at home and from remote locations. However, a smart home investment, just like any other kind of investment, requires adequate measurement and justification of the economic gains it could offer before it is realized. These economic gains can differ between occupants due to their inherent behaviours and tendencies. Thus it is pertinent to investigate the behaviours and tendencies of occupants in different domains of interest and to measure the value of the energy savings accrued by smart home implementations in these domains in order to justify such economic gains. This thesis investigates two domains of interest (the rented apartment and the owned apartment) for primarily two behavioural tendencies (Finland and Germany), obtained from observation and corroborated by interviews, to measure the payback time and return on investment (ROI) of smart home implementations. Similar measures are also obtained for an identified Australian use case. The findings reveal that building automation for the Finnish behavioural tendencies seems to offer a better ROI and payback time for smart home implementations.
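For concreteness, the two investment measures named above, simple payback time and ROI, can be computed as in the sketch below; the euro figures are hypothetical, not numbers from the thesis.

```python
# Sketch of the two investment measures used above, simple payback time
# and ROI, on hypothetical numbers.

def payback_time_years(investment: float, annual_savings: float) -> float:
    """Years until cumulative savings cover the initial investment."""
    return investment / annual_savings

def roi(investment: float, annual_savings: float, horizon_years: float) -> float:
    """Net gain over the horizon relative to the investment."""
    return (annual_savings * horizon_years - investment) / investment

# Hypothetical example: a 4000 EUR smart home retrofit saving 500 EUR/year.
print(payback_time_years(4000, 500))      # 8.0 years
print(roi(4000, 500, horizon_years=10))   # 0.25, i.e. 25 % over ten years
```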

Relevance:

30.00%

Abstract:

With the new age of the Internet of Things (IoT), everyday objects such as mobile smart devices are starting to be equipped with cheap sensors and low-energy wireless communication capabilities. Mobile smart devices (phones, tablets) have become ubiquitous, with everyone having access to at least one. There is an opportunity to build innovative applications and services by exploiting these devices' untapped rechargeable energy, sensing and processing capabilities. In this thesis, we propose, develop, implement and evaluate LoadIoT, a peer-to-peer load-balancing scheme that can distribute tasks among a plethora of mobile smart devices in the IoT world. We develop and demonstrate an Android-based proof-of-concept load-balancing application. We also present a model of the system, which is used to validate the efficiency of the load-balancing approach under varying application scenarios. Load-balancing concepts can be applied to IoT scenarios involving smart devices, reducing the traffic sent to the cloud and the energy consumption of the devices. The data acquired from the experiments enable us to determine the feasibility and cost-effectiveness of load-balanced P2P smartphone-based applications.
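To illustrate the kind of placement decision such a scheme makes, here is a minimal sketch of a task placement rule that favors peers with spare capacity and battery. The scoring rule and device fields are hypothetical, not LoadIoT's actual policy.

```python
# Illustrative peer-to-peer task placement rule: send each task to the
# peer that currently has the most spare capacity and battery. The
# score is a hypothetical example, not LoadIoT's actual policy.
from dataclasses import dataclass

@dataclass
class Peer:
    name: str
    battery: float   # remaining battery, 0..1
    cpu_load: float  # current CPU utilization, 0..1

def pick_peer(peers: list[Peer]) -> Peer:
    """Choose the peer maximizing a simple battery/idle-capacity score."""
    return max(peers, key=lambda p: p.battery * (1.0 - p.cpu_load))

peers = [Peer("phone-a", battery=0.9, cpu_load=0.7),
         Peer("tablet-b", battery=0.6, cpu_load=0.1),
         Peer("phone-c", battery=0.3, cpu_load=0.2)]
print(pick_peer(peers).name)  # tablet-b
```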

Relevance:

30.00%

Abstract:

The main goal of this master's thesis was to find out how to improve customer experience management and measurement. This study is a qualitative case study, in which the data were collected through interviews. In addition, some of the company's customer experience measurement methods were analyzed. The theoretical background is applied in practice by interviewing five representatives from the case company. In the case company, the management has launched a customer-experience-focused program and given guidelines for improving customer experience. Customer experience is measured in the case company with several methods; one example is asking customers about their readiness to recommend the company. In order to improve its customer experience management, the case company should define what it means by customer experience and what kind of customer experience it aims to create. After an encounter, the customer should be left with feelings of satisfaction, positivity and trust. The company should focus on making its processes easy, and the processes should also work fluently. Customer experience management should be improved through systematic planning and by combining and standardizing different measures. In addition, some channel-based measures should be used. The measurement should be more customer-focused, and the case company should form an understanding of which touchpoints are the most relevant to measure.

Relevance:

30.00%

Abstract:

The manufacturing industry has always faced the challenge of improving production efficiency, product quality and innovation ability, and has struggled to adopt cost-effective manufacturing systems. In recent years, cloud computing has emerged as one of the major enablers for the manufacturing industry. By combining cloud computing and other advanced manufacturing technologies, such as the Internet of Things, service-oriented architecture (SOA), networked manufacturing (NM) and manufacturing grid (MGrid), with existing manufacturing models and enterprise information technologies, a new paradigm called cloud manufacturing has been proposed in the recent literature. This study presents the concepts and ideas of cloud computing and cloud manufacturing. The concept, architecture, core enabling technologies and typical characteristics of cloud manufacturing are discussed, as well as the difference and relationship between cloud computing and cloud manufacturing. The research is based on mixed qualitative and quantitative methods and a case study. The case is a prototype cloud manufacturing solution, a software platform developed in cooperation between ATR Soft Oy and SW Company's China office. This study tries to understand the practical impacts and challenges that derive from cloud manufacturing. The main conclusion of this study is that cloud manufacturing is an approach to achieving the transformation from traditional production-oriented manufacturing to next-generation service-oriented manufacturing. Many manufacturing enterprises are already using a form of cloud computing in their existing network infrastructure to increase the flexibility of their supply chains and reduce resource consumption, and the study finds that the shift from cloud computing to cloud manufacturing is feasible. Meanwhile, the study points out that the theory, methodology and application of cloud manufacturing systems are far from mature; it is still an open field where many new technologies need to be studied.

Relevance:

30.00%

Abstract:

This thesis examines how content marketing is used in B2B customer acquisition and how a content marketing performance measurement system is built and utilized in this context. Literature related to performance measurement, branding and buyer behavior is examined in the theoretical part in order to identify the elements that influence the design and usage of content marketing performance measurement. A qualitative case study was chosen in order to gain a deep understanding of the phenomenon studied. The case company is a Finnish software vendor, which operates in B2B markets and has practiced content marketing for approximately two years. In-depth interviews were conducted with three employees from the marketing department. According to the findings, the content marketing performance measurement system's infrastructure is based on the target market's decision-making processes, the company's own customer acquisition process, a marketing automation tool and analytics solutions. The main roles of the content marketing performance measurement system are measuring performance, strategy management, and learning and improvement. The content marketing objectives in the context of customer acquisition are enhancing brand awareness, influencing brand attitude and generating leads. Both non-financial and financial outcomes are assessed with single phase-specific metrics, phase-specific overall KPIs and ratings related to leads' involvement.