802 results for Data-driven Methods
Abstract:
Purpose - The purpose of this paper is to apply the lattice Boltzmann method (LBM) with a multiple-relaxation-time (MRT) model to investigate lid-driven flow in a three-dimensional (3D) rectangular cavity, and to compare the results with flow in an equivalent two-dimensional (2D) cavity. Design/methodology/approach - The second-order MRT model is implemented in a 3D LBM code. The flow structure in cavities of different aspect ratios (0.25-4) and Reynolds numbers (0.01-1000) is investigated. The LBM simulation results are compared with those from numerical solution of the Navier-Stokes (NS) equations and with available experimental data. Findings - The 3D simulations demonstrate that 2D models may predict the flow structure reasonably well at low Reynolds numbers, but significant differences from experimental data appear at high Reynolds numbers. This discrepancy between 2D and 3D results is attributed to the effect of boundary layers near the side-walls in the transverse direction (in 3D), which generally weaken the vorticity in the core region. Secondly, owing to the vortex-stretching effect present in 3D flow, the vorticity in the transverse plane intensifies whereas that in the lateral plane decays as the Reynolds number increases. However, on the symmetry plane, the variation of the flow structure with cavity aspect ratio is found to be qualitatively consistent with the results of 2D simulations. Secondary flow vortices whose axes are in the direction of the lid motion are observed; these are weak at low Reynolds numbers but become quite strong at high Reynolds numbers. Originality/value - The findings will be useful in the study of a variety of enclosed fluid flows.
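For reference, the MRT collision-streaming step that such simulations build on can be written compactly as follows (standard notation, assumed here rather than taken from the paper): the distribution functions f_i are mapped to moment space by a transformation matrix M, relaxed toward their equilibria with a diagonal matrix of relaxation rates S, and mapped back before streaming,

\[
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\ t + \Delta t) = f_i(\mathbf{x}, t) - \left[\mathbf{M}^{-1}\mathbf{S}\,\big(\mathbf{m}(\mathbf{x},t) - \mathbf{m}^{\mathrm{eq}}(\mathbf{x},t)\big)\right]_i, \qquad \mathbf{m} = \mathbf{M}\,\mathbf{f}.
\]

The single-relaxation-time (BGK) model is recovered when all entries of S equal 1/\tau; the MRT form lets individual moments relax at different rates, which generally improves stability at the higher Reynolds numbers considered in the paper.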
Abstract:
Increasing antimicrobial resistance in bacteria has led to the need for a better understanding of antimicrobial usage patterns. In 1999, the World Organisation for Animal Health (OIE) recommended that an international ad hoc group should be established to address human and animal health risks related to antimicrobial resistance and the contribution of antimicrobial usage in veterinary medicine. In European countries, the need for continuous recording of veterinary antimicrobial usage, as well as for animal species-specific and indication-based usage data, has been acknowledged. Finland has been among the first countries to develop prudent use guidelines in veterinary medicine, as the Ministry of Agriculture and Forestry issued the first animal species-specific, indication-based recommendations for antimicrobial use in animals in 1996. These guidelines were revised in 2003 and 2009. However, surveillance of the species-specific use of antimicrobials in animals has not been performed in Finland. This thesis provides animal species-specific information on indication-based antimicrobial usage. Different methods for data collection have been utilized. Information on antimicrobial usage in animals has been gathered in four studies (studies A-D). Material from studies A, B and C has been used in an overlapping manner in the original publications I-IV. Study A (original publications I & IV) presents a retrospective cross-sectional survey on prescriptions for small animals at the Veterinary Teaching Hospital of the University of Helsinki. Prescriptions for antimicrobial agents (n = 2281) were collected and usage patterns, such as the indication and length of treatment, were reviewed. Most of the prescriptions were for dogs (78%), primarily for the treatment of skin and ear infections, most of which were treated with cephalexin for a median period of 14 days. Prescriptions for cats (18%) were most often for the treatment of urinary tract infections with amoxicillin for a median length of 10 days. Study B (original publication II) was a retrospective cross-sectional survey in which prescriptions for animals were collected from 17 University Pharmacies nationwide. Antimicrobial prescriptions (n = 1038), mainly for dogs (65%) and cats (19%), were investigated. In this study, cephalexin and amoxicillin were also the most frequently used drugs for dogs and cats, respectively. In study C (original publications III & IV), the indication-based usage of antimicrobials by practicing veterinarians was analyzed using a prospective questionnaire. Randomly selected practicing veterinarians in Finland (n = 262) recorded all their antimicrobial usage during a 7-day study period. Cattle (46%) with mastitis were the most common patients receiving antimicrobial treatment, generally intramuscular penicillin G or intramammary treatment with ampicillin and cloxacillin. The median length of treatment was four days, regardless of the route of administration. Antimicrobial use in horses was evaluated in study D, the results of which are previously unpublished. Firstly, data collected with the prospective questionnaire from the practicing veterinarians showed that horses (n = 89) were frequently treated for skin or wound infections using penicillin G or trimethoprim-sulfadiazine. The mean duration of treatment was five to seven days.
Secondly, according to retrospective data collected from patient records, horses (n = 74) that underwent colic surgery at the Veterinary Teaching Hospital of the University of Helsinki were generally treated according to national and hospital recommendations; penicillin G and gentamicin were administered preoperatively and treatment was continued for a median of three days postoperatively. In conclusion, Finnish veterinarians followed the national prudent use guidelines well. Narrow-spectrum antimicrobials were preferred and, for instance, fluoroquinolones were used sparingly. Prescription studies seemed to give good information on antimicrobial usage, especially when combined with complementary information from patient records. A prospective questionnaire study provided a fair amount of valuable data on several animal species. Electronic surveys are worth exploiting in the future.
Abstract:
In the direction of arrival (DOA) estimation problem, we encounter both finite data and insufficient knowledge of array characterization. It is therefore important to study how subspace-based methods perform in such conditions. We analyze the finite data performance of the multiple signal classification (MUSIC) and minimum norm (min. norm) methods in the presence of sensor gain and phase errors, and derive expressions for the mean square error (MSE) in the DOA estimates. These expressions are first derived assuming an arbitrary array and then simplified for the special case of a uniform linear array with isotropic sensors. When they are further simplified for the case of finite data only and sensor errors only, they reduce to the recent results given in [9-12]. Computer simulations are used to verify the agreement between the predicted and simulated values of the MSE.
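As background, a minimal sketch of the MUSIC pseudospectrum for a uniform linear array is given below (half-wavelength spacing assumed; the paper's finite-data and gain/phase-error analysis is not reproduced, and the function and parameter names are illustrative):

```python
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """MUSIC pseudospectrum for a uniform linear array.

    X         : (n_sensors, n_snapshots) complex snapshot matrix (finite data record)
    n_sources : assumed number of incident signals (must be < n_sensors)
    d         : sensor spacing in wavelengths (0.5 = half-wavelength)
    """
    n_sensors = X.shape[0]
    # Sample covariance from the finite data record
    R = X @ X.conj().T / X.shape[1]
    # Eigendecomposition (ascending); noise subspace spans the smallest eigenvalues
    eigvals, eigvecs = np.linalg.eigh(R)
    En = eigvecs[:, : n_sensors - n_sources]
    spectrum = []
    for theta in np.deg2rad(angles):
        # Nominal steering vector; gain/phase errors would perturb this model
        a = np.exp(-2j * np.pi * d * np.arange(n_sensors) * np.sin(theta))
        # Peaks occur where the steering vector is (nearly) orthogonal to the noise subspace
        spectrum.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return angles, np.asarray(spectrum)
```

Sensor gain and phase errors of the kind analyzed in the paper perturb the steering vector a(θ), which is why the derived MSE expressions depend on both the number of snapshots and the sensor error statistics.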
Abstract:
The method of stress characteristics has been employed to compute the end-bearing capacity of driven piles. The dependency of the soil internal friction angle on the stress level has been incorporated to achieve more realistic predictions of the end-bearing capacity of piles. The validity of the superposition assumption in the bearing capacity equation based on soil plasticity concepts, when applied to deep foundations, has also been examined. Fourteen pile case histories were compiled with cone penetration tests (CPT) performed in the vicinity of the different pile locations. The end-bearing capacity of the piles was computed using different methods, namely static analysis, the effective stress approach, direct CPT, and the proposed approach. The comparison between the predictions made by the different methods and the measured records shows that the stress-level-based method of stress characteristics agrees better with the experimental data. Finally, the end-bearing capacity of driven piles in sand was expressed as a general expression with a new factor that accounts for the different contributions to the bearing capacity. The influence of a non-associative soil flow rule has also been included to achieve more realistic results.
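For context, the classical superposition-based bearing capacity expression whose validity is examined here can be written as (standard form, not the paper's final expression):

\[
q_{\mathrm{ult}} = c\,N_c + q\,N_q + \tfrac{1}{2}\,\gamma\,B\,N_\gamma ,
\]

where c is the cohesion, q the overburden pressure at the foundation base, \gamma the soil unit weight, B the foundation width, and N_c, N_q, N_\gamma the bearing capacity factors, conventionally evaluated at a single friction angle even though each term mobilizes a different stress level.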
Abstract:
NMR spectroscopy has witnessed tremendous advancements in recent years with the development of new methodologies for structure determination and the availability of high-field-strength spectrometers equipped with cryogenic probes. Supported by these advancements, a new dimension of NMR research has emerged that aims to increase the speed with which data are collected and analyzed. Several novel methodologies have been proposed in this direction. This review focuses on the principles on which these different approaches are based, with an emphasis on G-matrix Fourier transform NMR spectroscopy.
Abstract:
Many dynamical systems, such as lakes, organisms, ocean circulation patterns, and financial markets, are now thought to have tipping points where critical transitions to a contrasting state can happen. Because critical transitions can occur unexpectedly and are difficult to manage, there is a need for methods that can be used to identify when a critical transition is approaching. Recent theory shows that we can identify the proximity of a system to a critical transition using a variety of so-called 'early warning signals', and successful empirical examples suggest a potential for practical applicability. However, while the range of proposed methods for predicting critical transitions is rapidly expanding, opinions on their practical use differ widely, and there is no comparative study that tests the limitations of the different methods in identifying approaching critical transitions using time-series data. Here, we summarize a range of currently available early warning methods and apply them to two simulated time series that are typical of systems undergoing a critical transition. In addition to a methodological guide, our work offers a practical toolbox that may be used in a wide range of fields to help detect early warning signals of critical transitions in time-series data.
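As an illustration of the kind of indicator such methods compute (a minimal sketch, not the authors' toolbox), rolling-window variance and lag-1 autocorrelation are the two most widely used early warning signals; both tend to rise as a system approaches a critical transition because of critical slowing down:

```python
import numpy as np

def early_warning_indicators(x, window=100):
    """Rolling variance and lag-1 autocorrelation of a 1-D time series.

    A sustained upward trend in either indicator ahead of a transition is
    interpreted as an early warning signal (critical slowing down).
    """
    x = np.asarray(x, dtype=float)
    variance, ar1 = [], []
    for i in range(window, len(x) + 1):
        w = x[i - window:i]
        w = w - w.mean()                              # remove the window mean
        variance.append(w.var())                      # rolling variance
        ar1.append(np.corrcoef(w[:-1], w[1:])[0, 1])  # lag-1 autocorrelation
    return np.asarray(variance), np.asarray(ar1)
```

In practice the time series is usually detrended (e.g. with a Gaussian kernel) before the indicators are computed, and trends in the indicators are then tested for significance, for instance with Kendall's tau.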
Abstract:
Online Social Networks (OSNs) make it easy to create and spread information rapidly, influencing others to participate and propagate it. This work proposes a novel method of profiling the Influential Blogger (IB) based on the activities performed on their blog documents, through which they influence various other bloggers in a Social Blog Network (SBN). After constructing a social blogging site, the SBN is analyzed with appropriate parameters to obtain the Influential Blog Power (IBP) of each blogger in the network, demonstrating that profiling the IB is adequate and accurate. With the proposed Profiling Influential Blogger (PIB) algorithm, the survival rate of the IB is high and stable.
Abstract:
The effectiveness of Oliver & Pharr's (O&P's) method, Cheng & Cheng's (C&C's) method, and a new method developed by our group for estimating Young's modulus and hardness from instrumented indentation was evaluated for the case of a yield stress to reduced Young's modulus ratio σ_y/E_r ≥ 4.55 × 10⁻⁴ and a hardening coefficient n ≤ 0.45. Dimensional analysis and finite element simulations were applied to produce reference results for this purpose. Both O&P's and C&C's methods overestimated the Young's modulus under some conditions, although the error can be kept within ±16% if the formulation is modified with appropriate correction functions. No such modification was introduced into our method for determining Young's modulus, yet the maximum error of its results was around ±13%. The errors of the hardness values obtained from all three methods could be even larger and were irreducible with any correction scheme. It is therefore suggested that when hardness values of different materials are compared, relative comparison of data obtained with a single standard measurement technique is more practically useful. It is noted that the error ranges derived from the analysis could differ if different ranges of the material parameters σ_y/E_r and n are considered.
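For reference, the standard Oliver & Pharr relations that the evaluated methods build on can be sketched as follows (ideal Berkovich area function assumed; the correction functions discussed above are not included):

```python
import numpy as np

def oliver_pharr(P_max, S, h_max, beta=1.05, eps=0.75):
    """Hardness and reduced modulus from an instrumented indentation unloading curve.

    P_max : peak load
    S     : unloading stiffness dP/dh evaluated at peak load
    h_max : penetration depth at peak load
    beta  : indenter geometry correction factor (commonly ~1.03-1.05)
    eps   : geometric constant of the unloading fit (0.75 for a paraboloid)
    """
    h_c = h_max - eps * P_max / S                            # contact depth
    A_c = 24.56 * h_c**2                                     # ideal Berkovich contact area
    H = P_max / A_c                                          # hardness
    E_r = np.sqrt(np.pi) * S / (2.0 * beta * np.sqrt(A_c))   # reduced Young's modulus
    return H, E_r
```

The specimen's Young's modulus then follows from 1/E_r = (1 - ν²)/E + (1 - ν_i²)/E_i, given the Poisson's ratios and the indenter modulus E_i.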
Abstract:
DNA microarrays provide such a huge amount of data that unsupervised methods are required to reduce the dimension of the data set and to extract meaningful biological information. This work shows that Independent Component Analysis (ICA) is a promising approach for the analysis of genome-wide transcriptomic data. The paper first presents an overview of the most popular algorithms for performing ICA. These algorithms are then applied to a microarray breast-cancer data set. Some issues regarding the application of ICA and the evaluation of the biological relevance of the results are discussed. This study indicates that ICA significantly outperforms Principal Component Analysis (PCA).
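A minimal sketch of the kind of decomposition being compared, using scikit-learn (the specific ICA algorithms and the breast-cancer data set reviewed in the paper are not reproduced; the random matrix below is only a stand-in for an expression matrix):

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

# X: genes x samples expression matrix (toy random stand-in for a microarray data set)
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 50))

n_components = 10
# PCA: orthogonal components ranked by explained variance
pca_scores = PCA(n_components=n_components).fit_transform(X)
# ICA: statistically independent components, candidate "expression modes"
ica_scores = FastICA(n_components=n_components, random_state=0).fit_transform(X)

print(pca_scores.shape, ica_scores.shape)  # (2000, 10) each
```

Biological relevance of the independent components is typically assessed by inspecting the genes with extreme loadings on each component, for instance through enrichment analysis.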
Abstract:
Electrically addressed silicon bulk acoustic wave microresonators offer high-Q solutions for applications in sensing and signal processing. However, the electrically transduced motional signal is often swamped by parasitic feedthrough in hybrid technologies. With the aim of enhancing the ratio of the motional to feedthrough current at nominal operating voltages, this paper benchmarks a variety of drive and detection principles for electrostatically driven square-extensional mode resonators operating in air and fabricated in a foundry MEMS process with 2 μm gaps. A new detection technique, combining second-harmonic capacitive actuation and piezoresistive detection, outperforms previously reported methods using voltages as low as ±3 V in air, providing a promising solution for low-voltage CMOS-MEMS integration.
Abstract:
Data have been collected on fisheries catch and effort trends since the latter half of the 1800s. With current trends of declining stocks and stricter management regimes, data need to be collected and analyzed over shorter periods and at finer spatial resolution than in the past. New methods of electronic reporting may reduce the lag time in data collection and provide more accurate spatial resolution. In this study I evaluated the differences between fish dealer and vessel reporting systems for federal fisheries in the US New England and Mid-Atlantic areas. Using data on landing date, report date, gear used, port landed, number of hauls, number of fish sampled, and species quotas from available catch and effort records, I compared electronically collected dealer and vessel data against paper-collected dealer and vessel data to determine whether electronically collected data are timelier and more accurate. To determine whether vessel or dealer electronic reporting is more useful for management, I examined differences in timeliness and accuracy between vessel and dealer electronic reports. I also compared the cost and efficiency of these new methods with less technology-intensive reporting methods, using available cost data and surveys of seafood dealers for cost information. Using this information, I identified potentially unnecessary duplication of effort and identified applications in ecosystem-based fisheries management. This information can be used to guide the decisions of fisheries managers in the United States and other countries that are attempting to identify appropriate fisheries reporting methods for the management regimes under consideration.