979 results for Piecewise linear techniques


Relevance: 30.00%

Abstract:

If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB-design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, that inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided.
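
As one concrete illustration, the following minimal Python sketch (with synthetic data, not the study's Monte Carlo code) computes the NAP index by comparing every baseline observation with every treatment observation, counting ties as one half:

import numpy as np

def nap(phase_a, phase_b):
    # Nonoverlap of All Pairs: proportion of (A, B) pairs in which the
    # treatment-phase value exceeds the baseline value (ties count 0.5).
    a = np.asarray(phase_a, dtype=float)[:, None]
    b = np.asarray(phase_b, dtype=float)[None, :]
    wins = (b > a).sum() + 0.5 * (b == a).sum()
    return wins / (a.size * b.size)

rng = np.random.default_rng(0)
baseline = rng.normal(10.0, 1.0, size=8)     # phase A (hypothetical scores)
treatment = rng.normal(12.0, 1.0, size=12)   # phase B with a level change
print(round(nap(baseline, treatment), 3))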

Relevance: 30.00%

Abstract:

Polynomial constraint solving plays a prominent role in several areas of hardware and software analysis and verification, e.g., termination proving, program invariant generation and hybrid system verification, to name a few. In this paper we propose a new method for solving non-linear constraints based on encoding the problem into an SMT problem considering only linear arithmetic. Unlike other existing methods, our method focuses on proving satisfiability of the constraints rather than on proving unsatisfiability, which is more relevant in many applications, as we illustrate with several examples. Nevertheless, we also present new techniques based on the analysis of unsatisfiable cores that allow one to efficiently prove unsatisfiability too for a broad class of problems. The power of our approach is demonstrated by means of extensive experiments comparing our prototype with state-of-the-art tools on benchmarks taken from both the academic and the industrial world.
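
To make the flavour of such an encoding concrete, the following minimal Python sketch (using the z3 bindings purely as a stand-in; the constraint, bounds, and variable names are illustrative and not taken from the paper) case-splits a bounded variable so that the non-linear product x*y becomes a set of purely linear implications:

from z3 import Ints, Solver, Or, Implies, sat

x, y, z = Ints("x y z")
s = Solver()
s.add(x >= 0, x <= 3, z >= 0, z <= 10)        # finite bounds enable the linearization
for v in range(4):
    # For each fixed value of x, the product x*y collapses to the linear term v*y.
    s.add(Implies(x == v, v * y + z == 7))    # target constraint: x*y + z == 7
s.add(Or([x == v for v in range(4)]))         # x must take one of the enumerated values

if s.check() == sat:
    print(s.model())

A model of this linear problem is a model of the original polynomial constraint, whereas failure to find one only refutes the constraint within the chosen bounds; this asymmetry is why the satisfiability-oriented view fits this style of encoding.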

Relevance: 30.00%

Abstract:

Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, xenon CT is an equilibrium technique based on a freely diffusible tracer. The first pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods have proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and X-ray attenuation. For the CT methods, concerns regarding the X-ray dose delivered to the patient need to be addressed. MR is also able to assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method for assessing brain perfusion without injection of contrast. In this case, the blood flow in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limits that will be illustrated and summarised.
Learning objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).

Relevance: 30.00%

Abstract:

This paper deals with non-linear transformations for improving the performance of an entropy-based voice activity detector (VAD). The idea to use a non-linear transformation has already been applied in the field of speech linear prediction, or linear predictive coding (LPC), based on source separation techniques, where a score function is added to classical equations in order to take into account the true distribution of the signal. We explore the possibility of estimating the entropy of frames after calculating its score function, instead of using original frames. We observe that if the signal is clean, the estimated entropy is essentially the same; if the signal is noisy, however, the frames transformed using the score function may give entropy that is different in voiced frames as compared to nonvoiced ones. Experimental evidence is given to show that this fact enables voice activity detection under high noise, where the simple entropy method fails.
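
The following minimal Python sketch illustrates the processing chain described above, not the authors' implementation: each frame is passed through a score-like nonlinearity (here the tanh nonlinearity commonly used for super-Gaussian sources, an assumption) before a histogram-based entropy is estimated:

import numpy as np

def frame_entropy(frame, bins=32):
    # Histogram-based entropy estimate of a single frame, in bits.
    hist, _ = np.histogram(frame, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

def score_transform(frame):
    # Score-like nonlinearity under a super-Gaussian signal assumption.
    sigma = np.std(frame) + 1e-12
    return np.tanh(frame / sigma)

rng = np.random.default_rng(0)
frame = rng.laplace(scale=0.5, size=512) + rng.normal(scale=0.3, size=512)  # noisy frame stand-in
print(frame_entropy(frame), frame_entropy(score_transform(frame)))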

Relevance: 30.00%

Abstract:

In this paper we present a method for blind deconvolution of linear channels based on source separation techniques, for real-world signals. Applied to blind deconvolution problems, this technique exploits not the spatial independence between signals but the temporal independence between samples of the signal. Our objective is to minimize the mutual information between samples of the output in order to retrieve the original signal. To make use of this idea, the input signal must be a non-Gaussian i.i.d. signal. Because most real-world signals do not have this i.i.d. nature, we need to preprocess the original signal before transmission into the channel. Likewise, we must ensure that the transmitted signal has non-Gaussian statistics for the algorithm to function correctly. The strategy used for this preprocessing is presented in this paper. If the receiver has the inverse of the preprocessing, the original signal can be reconstructed without the convolutive distortion.
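
As a rough illustration of the invertible preprocessing idea, the following Python sketch whitens a correlated signal with an LPC-style prediction-error filter (an assumed stand-in for the paper's preprocessing strategy, not its exact method) and shows that the receiver can undo it with the inverse filter:

import numpy as np
from scipy.signal import lfilter

def lpc_coeffs(x, order):
    # Autocorrelation-method predictor: solve the normal equations R a = r.
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

rng = np.random.default_rng(0)
s = lfilter([1.0], [1.0, -0.95], rng.laplace(size=8000))   # correlated "real-world" stand-in

a = lpc_coeffs(s, order=10)
e = lfilter(np.r_[1.0, -a], [1.0], s)        # prediction error: approximately i.i.d., non-Gaussian
s_rec = lfilter([1.0], np.r_[1.0, -a], e)    # receiver applies the inverse of the preprocessing
print(np.max(np.abs(s - s_rec)))             # reconstruction error (floating-point level)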

Relevance: 30.00%

Abstract:

In this paper, an advanced technique for the generation of deformation maps using synthetic aperture radar (SAR) data is presented. The algorithm estimates the linear and nonlinear components of the displacement, the error of the digital elevation model (DEM) used to cancel the topographic terms, and the atmospheric artifacts from a reduced set of low spatial resolution interferograms. The pixel candidates are selected from those presenting a good coherence level in the whole set of interferograms, and the resulting nonuniform mesh is tessellated with the Delaunay triangulation to establish connections among them. The linear component of movement and the DEM error are estimated by adjusting a linear model to the data only on the connections. Later on, this information, once unwrapped to retrieve the absolute values, is used to calculate the nonlinear component of movement and the atmospheric artifacts with alternate filtering techniques in both the temporal and spatial domains. The method presents high flexibility with respect to the required number of images and the baseline lengths. However, better results are obtained with large datasets of short-baseline interferograms. The technique has been tested with European Remote Sensing SAR data from an area of Catalonia (Spain) and validated with precise on-field leveling measurements.
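
A minimal numerical Python sketch of the arc-wise linear adjustment (synthetic baselines, phases assumed unwrapped and free of atmosphere, and ERS-like constants chosen only for illustration) estimates the velocity increment and DEM-error increment on a single connection by least squares:

import numpy as np

rng = np.random.default_rng(1)
n_ifg = 20
T = rng.uniform(-2.0, 2.0, n_ifg)          # temporal baselines [years]
Bp = rng.uniform(-400.0, 400.0, n_ifg)     # perpendicular baselines [m]
lam, R, theta = 0.056, 850e3, np.deg2rad(23.0)   # wavelength, range, incidence (illustrative)
k = 4 * np.pi / lam

dv_true, dh_true = 0.01, 5.0               # arc velocity increment [m/yr] and DEM error [m]
phase = k * T * dv_true + k * Bp / (R * np.sin(theta)) * dh_true
phase += rng.normal(0, 0.3, n_ifg)         # decorrelation noise

A = np.column_stack([k * T, k * Bp / (R * np.sin(theta))])   # linear model on one connection
dv_est, dh_est = np.linalg.lstsq(A, phase, rcond=None)[0]
print(dv_est, dh_est)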

Relevance: 30.00%

Abstract:

The ultimate goal of any research in the mechanism/kinematics/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology in order to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As part of a systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions to the problem are obtained by adopting closed-form classical or modern algebraic solution methods or by using numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limitations on the number of design specifications and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of the approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints. Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground-pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values are considered, including the degenerate cases, for which the system is solvable). By adopting the developed solution method to solve the dyadic equations in direct polynomial form for two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be resolved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at the system level demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed kinematic design method is based on the combination of a two-precision-point formulation and the optimisation (with mathematical programming techniques or with optimisation methods based on probability and statistics) of substructures using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach for mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with the mechanical system simulation techniques.
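
For readers unfamiliar with precision-point synthesis, the following Python sketch shows the classical textbook standard-form dyad equation for three-precision-point motion generation solved as a linear system; it is only context for the dyadic equations mentioned above, not the thesis' algebraic-geometry formulation, and all numbers are illustrative:

import numpy as np

delta = np.array([1.0 + 0.5j, 2.0 + 0.2j])   # displacements of the precision point (complex plane)
alpha = np.deg2rad([20.0, 40.0])             # prescribed coupler rotations
beta = np.deg2rad([30.0, 70.0])              # free choices for the crank rotations

# Standard form: W*(exp(i*beta_j) - 1) + Z*(exp(i*alpha_j) - 1) = delta_j for j = 2, 3.
A = np.array([[np.exp(1j * beta[0]) - 1, np.exp(1j * alpha[0]) - 1],
              [np.exp(1j * beta[1]) - 1, np.exp(1j * alpha[1]) - 1]])
W, Z = np.linalg.solve(A, delta)             # dyad vectors (crank and coupler side)
print(W, Z)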

Relevance: 30.00%

Abstract:

Background: Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results: Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps to overcome some of the numerical difficulties that arise during the global optimization task.
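
The following minimal Python sketch illustrates recasting on the simplest possible case: a Michaelis-Menten rate law rewritten as an exactly equivalent GMA (power-law) system through the auxiliary variable w = Km + S. The parameters are illustrative and this is not the paper's SC-model procedure:

import numpy as np
from scipy.integrate import solve_ivp

Vmax, Km, S0 = 1.0, 0.5, 2.0

def mm(t, y):                     # original non-linear kinetic model
    S = y[0]
    return [-Vmax * S / (Km + S)]

def gma(t, y):                    # recast model: rates are products of power laws only
    S, w = y
    rate = Vmax * S ** 1.0 * w ** -1.0
    return [-rate, -rate]         # dw/dt = d(Km + S)/dt = dS/dt

t_eval = np.linspace(0, 10, 50)
sol_mm = solve_ivp(mm, (0, 10), [S0], t_eval=t_eval)
sol_gma = solve_ivp(gma, (0, 10), [S0, Km + S0], t_eval=t_eval)
print(np.max(np.abs(sol_mm.y[0] - sol_gma.y[0])))   # trajectories agree up to integrator tolerance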

Relevance: 30.00%

Abstract:

This paper analyzes applications of cumulant analysis in speech processing. Special focus is placed on different second-order statistics. A dominant role is played by an integral representation of cumulants by means of integrals involving cyclic products of kernels.

Relevance: 30.00%

Abstract:

In this paper we explore the use of non-linear transformations to improve the performance of an entropy-based voice activity detector (VAD). The idea of using a non-linear transformation comes from previous work in the field of speech linear prediction (LPC) based on source separation techniques, where a score function is added to the classical equations in order to take into account the real distribution of the signal. We explore the possibility of estimating the entropy of frames after calculating their score function, instead of using the original frames. We observe that if the signal is clean, the estimated entropy is essentially the same; but if the signal is noisy, the transformed frames (with the score function) yield different entropy for voiced frames compared with unvoiced ones. Experimental results show that this makes it possible to detect voice activity under high noise, where the simple entropy method fails.
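
For comparison, here is a minimal Python sketch of the kind of plain entropy VAD used as a baseline above (an assumed baseline, not the authors' code): the normalised spectral entropy of each frame is thresholded, with the frame length and threshold chosen only for illustration:

import numpy as np

def spectral_entropy(frame):
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(frame.size))) ** 2
    n_bins = spectrum.size
    p = spectrum / (spectrum.sum() + 1e-12)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(n_bins))   # normalised to [0, 1]

def entropy_vad(signal, frame_len=256, threshold=0.85):
    # Flag frames whose spectra are peaked (low entropy) as speech.
    n_frames = signal.size // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    return np.array([spectral_entropy(f) < threshold for f in frames])

rng = np.random.default_rng(0)
noise = rng.normal(scale=0.1, size=4096)
tone = 0.5 * np.sin(2 * np.pi * 200 * np.arange(4096) / 8000)   # crude stand-in for voiced speech
print(entropy_vad(noise + tone).astype(int), entropy_vad(noise).astype(int))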

Relevance: 30.00%

Abstract:

Coherent anti-Stokes Raman scattering (CARS) is a powerful method of laser spectroscopy in which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the received spectra. The objective of this thesis is to develop a new phase retrieval algorithm for CARS. It utilizes the maximum entropy method and a new wavelet approach for spectroscopic background correction of the phase function. The method was developed to be easily automated and used on large numbers of spectra of different substances. The algorithm was successfully tested on experimental data.
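
A minimal Python sketch of a wavelet background-correction step (illustrative only: the wavelet family, decomposition level, and synthetic spectrum are assumptions, and the maximum entropy phase retrieval itself is not shown) keeps only the coarse approximation so that the slowly varying background survives while narrow resonant features are suppressed:

import numpy as np
import pywt

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 1024)
background = 0.5 * np.sin(2 * np.pi * 0.7 * x)                 # slow baseline / error phase
lines = np.exp(-((x - 0.4) / 0.005) ** 2)                      # narrow resonance
signal = background + lines + 0.01 * rng.normal(size=x.size)

coeffs = pywt.wavedec(signal, "sym8", level=6)
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]            # discard detail scales
est_background = pywt.waverec(coeffs, "sym8")[: x.size]
corrected = signal - est_background
print(float(x[np.argmax(corrected)]))                          # resonance position survives correction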

Relevance: 30.00%

Abstract:

Two spectrophotometric methods are described for the simultaneous determination of ezetimibe (EZE) and simvastatin (SIM) in pharmaceutical preparations. The obtained data were evaluated using two different chemometric techniques, Principal Component Regression (PCR) and Partial Least Squares (PLS-1). In these techniques, the concentration data matrix was prepared using mixtures containing these drugs in methanol. The corresponding absorbance data matrix was obtained by measuring absorbances in the range of 240-300 nm at intervals of Δλ = 1 nm (61 wavelengths) in the zero-order spectra; calibration (regression) was then obtained from the absorbance and concentration data matrices and used for the prediction of the unknown concentrations of EZE and SIM in their mixtures. The procedure did not require any separation step. The linear range was found to be 5-20 µg mL-1 for both EZE and SIM in both methods. The accuracy and precision of the methods were assessed. Both methods were successfully applied to a pharmaceutical preparation (tablets), and the results were compared with each other.
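
The following minimal Python sketch reproduces the shape of the two calibrations with synthetic data standing in for the 61-wavelength absorbance matrix and the EZE/SIM concentration matrix (the component counts, noise level, and scikit-learn implementation are assumptions):

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
C = rng.uniform(5, 20, size=(25, 2))                 # concentration data matrix, µg mL-1
pure = rng.random((2, 61))                           # pure-component spectra (synthetic)
A = C @ pure + 0.01 * rng.normal(size=(25, 61))      # absorbance data matrix, Beer-Lambert-style

pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(A, C)
pls_models = [PLSRegression(n_components=2).fit(A, C[:, k]) for k in range(2)]  # PLS-1: one model per analyte

print(pcr.predict(A[:1]))
print([m.predict(A[:1]).ravel()[0] for m in pls_models])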

Relevance: 30.00%

Abstract:

One of the most difficult procedures in digestive-tract surgery is esophago-jejunal anastomosis following total gastrectomy. Cost/benefit analysis of this procedure justifies the use of mechanical staplers, in spite of their high cost. A technical variant of the side-to-side esophago-jejunal anastomosis is presented, which incorporates the use of a cutting linear stapler. The technical maneuvers are easy to perform, the cost of the cutting linear stapler is lower than that of circular staplers, the amplitude of the anastomosis is wider, and the likelihood of fistulae is smaller than with other techniques. The side-to-side esophago-jejunal anastomosis with the cutting linear stapler is always complemented by a manual suture.

Relevance: 30.00%

Abstract:

The present paper describes an integrated micro/macro mechanical study of the elastic-viscoplastic behavior of unidirectional metal matrix composites (MMC). The micromechanical analysis of the elastic moduli is based on the Composite Cylinder Assemblage (CCA) model, with comparisons also drawn with a Representative Unit Cell (RUC) technique. These "homogenization" techniques are later incorporated into the Vanishing Fiber Diameter (VFD) model and a new formulation is proposed. The concept of a smeared element procedure is employed in conjunction with two different versions of the Bodner-Partom elastic-viscoplastic constitutive model for the associated macroscopic analysis. The formulations developed are also compared against experimental and analytical results available in the literature.
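
As background for the homogenization step, the following Python sketch gives the elementary rule-of-mixtures (Voigt/Reuss) estimates of the longitudinal and transverse moduli of a unidirectional composite; the constituent values are illustrative, and this is not the CCA or RUC derivation used in the paper:

Ef, Em = 400.0e9, 70.0e9        # fibre and matrix Young's moduli [Pa] (illustrative)
Vf = 0.5                        # fibre volume fraction
Vm = 1.0 - Vf

E1 = Vf * Ef + Vm * Em          # longitudinal modulus, Voigt estimate
E2 = 1.0 / (Vf / Ef + Vm / Em)  # transverse modulus, Reuss estimate
print(E1 / 1e9, E2 / 1e9)       # in GPa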

Relevance: 30.00%

Abstract:

Linear alkylbenzenes (LAB), formed by the AlCl3- or HF-catalyzed alkylation of benzene, are common raw materials for surfactant manufacture. Normally they are sulphonated using SO3 or oleum to give the corresponding linear alkylbenzene sulphonates in >95% yield. As concern has grown about the environmental impact of surfactants, questions have been raised about the trace levels of unreacted raw materials, linear alkylbenzenes and minor impurities present in them. With the advent of modern analytical instruments and techniques, namely GC/MS, the opportunity has arisen to identify the exact nature of these impurities and to determine the actual levels of them present in commercial linear alkylbenzenes. The object of the proposed study was to separate, identify and quantify major and minor components (1-10%) in commercial linear alkylbenzenes. The focus of this study was on the structure elucidation and determination of impurities and on their qualitative determination in all analyzed linear alkylbenzene samples. A gas chromatography/mass spectrometry (GC/MS) study was performed on five samples from the same manufacturer (different production dates) and then followed by the analyses of ten commercial linear alkylbenzenes from four different suppliers. All the major components, namely the linear alkylbenzene isomers, followed the same elution pattern, with the 2-phenyl isomer eluting last. The individual isomers were identified by interpretation of their electron impact and chemical ionization mass spectra. The percent isomer distribution was found to differ from sample to sample. Average molecular weights were calculated using two methods, GC and GC/MS, and compared with the results reported on the Certificates of Analysis (C.O.A.) provided by the manufacturers of commercial linear alkylbenzenes. The GC results in most cases agreed with the reported values, whereas the GC/MS results were significantly lower, by between 0.41 and 3.29 amu. The minor components, impurities such as branched alkylbenzenes and dialkyltetralins, eluted according to their molecular weights. Their fragmentation patterns were studied using the electron impact ionization mode and their molecular weight ions confirmed by a 'soft ionization' technique, chemical ionization. The level of impurities present in the analyzed commercial linear alkylbenzenes was expressed as the percent of the total sample weight, as well as in mg/g. The percent of impurities was observed to vary between 4.5% and 16.8%, with the highest being in sample "I". Quantitation (mg/g) of impurities such as branched alkylbenzenes and dialkyltetralins was done using cis/trans-1,4,6,7-tetramethyltetralin as an internal standard. Samples were analyzed using a GC/MS system operating under full-scan and single-ion-monitoring data acquisition modes. The latter data acquisition mode, which offers higher sensitivity, was used to analyze all samples under investigation for the presence of linear dialkyltetralins. Dialkyltetralins were reported quantitatively, whereas branched alkylbenzenes were reported semi-qualitatively. The GC/MS method that was developed during the course of this study allowed identification of some other trace impurities present in commercial LABs. Compounds such as non-linear dialkyltetralins, dialkylindanes, diphenylalkanes and alkylnaphthalenes were identified, but their detailed structure elucidation and quantitation were beyond the scope of this study.
However, further investigation of these compounds will be the subject of a future study.
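
For context, internal-standard quantitation of the kind mentioned above follows the usual area-ratio calculation; the Python sketch below uses made-up peak areas and an assumed response factor, not data from this study:

area_analyte = 1.80e5       # peak area of a dialkyltetralin impurity (hypothetical)
area_is = 2.40e5            # peak area of the internal standard (hypothetical)
conc_is = 0.050             # internal standard added, mg per g of sample (hypothetical)
rf = 0.95                   # relative response factor, analyte vs. internal standard (assumed)

conc_analyte = (area_analyte / area_is) * conc_is / rf   # impurity level in mg/g
print(round(conc_analyte, 4))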