Abstract:
Logic programming systems that exploit and-parallelism among non-deterministic goals rely on notions of independence among those goals in order to ensure certain efficiency properties. "Non-strict" independence (NSI) is a more relaxed notion than the traditional notion of "strict" independence (SI); it still ensures the relevant efficiency properties and can allow considerably more parallelism than SI. However, all compilation technology developed to date has been based on SI, presumably because of the intrinsic complexity of exploiting NSI: unlike SI, NSI cannot be determined "a priori". This paper fills this gap by developing a technique for compile-time detection and annotation of NSI. It also proposes algorithms for combined compile-time/run-time detection, presenting novel run-time checks for this type of parallelism. In addition, a transformation procedure to eliminate shared variables among parallel goals is presented, attempting to perform as much work as possible at compile time. The approach is based on knowledge of certain properties of run-time instantiations of program variables (sharing and freeness) for which compile-time analysis technology is available, with new approaches currently being proposed.
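For orientation, a toy sketch (not the paper's NSI algorithm) of the classical run-time check for strict independence: two goals may run in and-parallel under SI iff they share no variables at run time. The tuple-based term representation and helper names below are hypothetical.

```python
# Toy illustration of the classical *strict* independence run-time check.
# NSI, the paper's subject, relaxes this by tolerating some shared variables.
# Terms are modeled as nested tuples; variables as strings starting with '_'.

def variables(term):
    """Collect the set of variable names occurring in a term."""
    if isinstance(term, str) and term.startswith('_'):
        return {term}
    if isinstance(term, tuple):
        return set().union(*(variables(t) for t in term))
    return set()

def strictly_independent(goal_a, goal_b):
    """Goals can run in and-parallel under SI iff their variables are disjoint."""
    return variables(goal_a).isdisjoint(variables(goal_b))

print(strictly_independent(('p', '_X', ('f', '_Y')), ('q', '_Z')))  # True
print(strictly_independent(('p', '_X'), ('q', '_X')))               # False
```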
Abstract:
This paper presents an approximation to the study of parallel systems using sequential tools. Independent and-parallelism in Prolog is an example of a parallel processing paradigm in the framework of logic programming, and implementations like
Abstract:
Images acquired during free breathing using first-pass gadolinium-enhanced myocardial perfusion magnetic resonance imaging (MRI) exhibit a quasiperiodic motion pattern that needs to be compensated for if a further automatic analysis of the perfusion is to be executed. In this work, we present a method to compensate for this movement by combining independent component analysis (ICA) and image registration: First, we use ICA and a time-frequency analysis to identify the motion and separate it from the intensity change induced by the contrast agent. Then, synthetic reference images are created by recombining all the independent components except the one related to the motion. Therefore, the resulting image series does not exhibit motion, and its images have intensities similar to those of their original counterparts. Motion compensation is then achieved by using a multi-pass image registration procedure. We tested our method on 39 image series acquired from 13 patients, covering the basal, mid and apical areas of the left ventricle and consisting of 58 perfusion images each. We validated our method by comparing manually tracked intensity profiles of the myocardial sections to automatically generated ones before and after registration of 13 patient data sets (39 distinct slices). We compared linear, non-linear, and combined ICA-based registration approaches and previously published motion compensation schemes. Considering run time and accuracy, a two-step ICA-based motion compensation scheme that first optimizes a translation and then a non-linear transformation performed best, and achieves registration of the whole series in 32 ± 12 s on a recent workstation. The proposed scheme improves Pearson's correlation coefficient between manually and automatically obtained time-intensity curves from .84 ± .19 before registration to .96 ± .06 after registration.
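A minimal sketch of the ICA-plus-recombination step described above, assuming the perfusion series has been flattened to a (frames × pixels) matrix and that the motion-related component is the one whose time course carries the most power in an assumed breathing band; sklearn's FastICA stands in for whatever ICA implementation the authors used.

```python
# Hedged sketch, not the authors' exact pipeline.
import numpy as np
from sklearn.decomposition import FastICA

def motion_free_reference(series, n_components=5, band=(0.2, 0.5), dt=1.0):
    """series: (n_frames, n_pixels) array. Returns synthetic motion-free frames."""
    ica = FastICA(n_components=n_components, random_state=0)
    S = ica.fit_transform(series)                # temporal ICs, (frames, comps)
    freqs = np.fft.rfftfreq(S.shape[0], d=dt)
    power = np.abs(np.fft.rfft(S, axis=0)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    motion = np.argmax(power[in_band].sum(0) / power.sum(0))  # breathing-like IC
    S = S.copy()
    S[:, motion] = 0.0                           # drop the motion-related IC
    return S @ ica.mixing_.T + ica.mean_         # recombine the remaining ICs
```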
Abstract:
Many mobile devices nowadays embed inertial sensors. This enables new forms of human-computer interaction through the use of gestures (movements performed with the mobile device) as a way of communication. This paper presents an accelerometer-based gesture recognition system for mobile devices which is able to recognize a collection of 10 different hand gestures. The system was conceived to be lightweight and to operate in a user-independent manner in real time. The recognition system was implemented in a smartphone and evaluated through a collection of user tests, which showed a recognition accuracy similar to other state-of-the-art techniques and a lower computational complexity. The system was also used to build a human-robot interface that enables controlling a wheeled robot with gestures made with the mobile phone.
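The abstract does not name the exact features or classifier; as a hypothetical illustration of a lightweight, user-independent pipeline, one can summarize each recording with per-axis statistics and classify with a nearest-centroid rule.

```python
# Hypothetical lightweight pipeline, not the paper's exact method.
import numpy as np

def features(accel):
    """accel: (n_samples, 3) accelerometer trace -> fixed-length feature vector."""
    return np.concatenate([accel.mean(0), accel.std(0),
                           (accel ** 2).mean(0)])   # mean, std, energy per axis

class NearestCentroidGestures:
    def fit(self, traces, labels):
        X = np.array([features(t) for t in traces])
        y = np.array(labels)
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(0) for c in self.classes_])
        return self

    def predict(self, trace):
        d = np.linalg.norm(self.centroids_ - features(trace), axis=1)
        return self.classes_[np.argmin(d)]
```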
Abstract:
Definition and study of the innovative Natura façade, made of independent pre-vegetated, water-storage-type panels.
Abstract:
We develop a simplified model of choked flow in pipes for CO2-water solutions as an important step in the modelling of a whole hydraulic system with the intention of eliminating the carbon dioxide generated in air-independent submarine propulsion. The model is based on an approximate fitting of the homogeneous isentropic solution upstream of a valve (or any other area restriction), for given fluid conditions at the entrance. The relative maximum choking back-pressure is computed as a function of area restriction ratio. Although the procedure is generic for gas solutions, numeric values for the non-dimensional parameters in the analysis are developed only for choking in the case of carbon dioxide solutions up to the pure-water limit.
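For orientation only (the paper models CO2-water solutions, which behave differently): the general flavour of a choking calculation can be seen in the textbook ideal-gas case, where the maximum back-pressure ratio at a restriction follows directly from the isentropic relations.

```python
# Textbook ideal-gas analogue, NOT the paper's two-phase solution model:
# for isentropic flow, choking at a throat occurs at the critical pressure
# ratio p*/p0 = (2 / (gamma + 1)) ** (gamma / (gamma - 1)).

def critical_pressure_ratio(gamma):
    """Choking back-pressure over stagnation pressure for an ideal gas."""
    return (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))

print(critical_pressure_ratio(1.29))  # ~0.55 for CO2 treated as an ideal gas
```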
Abstract:
Computation of Independent Sensitivities Using Maggi’s Formulation
Abstract:
Fission product yields are fundamental parameters for several nuclear engineering calculations and in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take into account the complete uncertainty data. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. Then, we focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values. Results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat. Introduction: Nowadays, any engineering calculation performed in the nuclear field should be accompanied by an uncertainty analysis. In such an analysis, different sources of uncertainties are taken into account. Works such as those performed under the UAM project (Ivanov et al., 2013) treat nuclear data as a source of uncertainty, in particular cross-section data, for which uncertainties given in the form of covariance matrices are already provided in the major nuclear data libraries. Meanwhile, fission yield uncertainties were often neglected or treated shallowly, because their effects were considered of second order compared to cross-sections (Garcia-Herranz et al., 2010). However, the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)
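A hedged sketch of the Monte Carlo sampling step (not the ACAB/ALEPH-2 machinery): correlated fission-yield sets are drawn from the covariance matrix and pushed through a response to obtain decay-heat statistics. The linear response vector here is a hypothetical stand-in for a full inventory calculation.

```python
# Hedged sketch: Monte Carlo propagation of fission-yield covariances.
import numpy as np

rng = np.random.default_rng(0)

def sample_yields(mean_yields, covariance, n_samples=1000):
    """Draw correlated fission-yield sets from a multivariate normal model."""
    return rng.multivariate_normal(mean_yields, covariance, size=n_samples)

def decay_heat_uncertainty(mean_yields, covariance, response):
    """response: hypothetical linear decay-heat response per unit yield."""
    samples = sample_yields(mean_yields, covariance)
    heat = samples @ response            # one decay-heat value per random draw
    return heat.mean(), heat.std()       # correlations shape the spread here
```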
Abstract:
The verification and validation activity plays a fundamental role in improving software quality. Determining which techniques are most effective for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques (the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT)). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible for the technique (InScope or OutScope, respectively). We have used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we have: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor. They can be explained by small sample effects. The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies for sure. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
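To make the two techniques concrete (an illustrative example, not drawn from the experiment's materials): for a small function, equivalence partitioning derives one test per specification-level input class, while branch testing derives tests from the code so that every branch outcome is exercised.

```python
# Illustrative only; not one of the experiment's programs.
def grade(score):
    if score < 0 or score > 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Equivalence partitioning (black-box): one representative per input class.
assert grade(25) == "fail"          # valid class, below the pass threshold
assert grade(75) == "pass"          # valid class, at or above the threshold
for bad in (-1, 101):               # invalid classes: too low / too high
    try:
        grade(bad)
        assert False, "expected ValueError"
    except ValueError:
        pass

# Branch testing (white-box): the same inputs happen to drive every branch
# outcome here (range check true/false, pass/fail both ways); in general BT
# starts from the code, EP from the specification.
```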
Abstract:
Physical fitness is a variable that is gaining prominence, especially from the health perspective. The improvement in quality of life experienced in recent years in developed societies has led to increased life expectancy, so that more and more people live longer. This rapid growth of the population over 60 years of age means that a group that had been almost forgotten by scientific research in the field of physical activity and sport is becoming highly relevant, with the aim of helping to fulfil the saying "do not only add years to life, but also add life to years". The main objective of this doctoral thesis is to assess fitness levels in the Spanish elderly population and to analyse the relationship between fitness, its determinants, and other aspects of health, such as body composition and cognitive status. In order to establish future public health policies related to physical activity and active ageing, it is necessary to identify the baseline fitness levels of the elderly population in Spain and their determinants. The work is based on data from the EXERNET multi-centre study ("Multi-centre Study for the Evaluation of Fitness Levels and their Relationship to Healthy Lifestyles in Non-institutionalised Spanish Elderly"), as well as on data from two studies conducted in institutionalised elderly people. A total of 3136 independently living elderly people from 6 Spanish autonomous regions and 153 elderly people institutionalised in nursing homes in the Region of Madrid were analysed. The main results of this thesis are as follows: a) Sex- and age-specific reference values and percentile curves were established for each fitness test in independently living, non-institutionalised Spanish elderly people. b) Men showed better fitness levels than women, except in the flexibility tests, and fitness tends to decline with age in both sexes. c) Low functional fitness levels were associated with increased perceived problems. d) The minimum functional fitness level at which the elderly perceive problems in their activities of daily living (ADL) is similar in both sexes. e) High fitness levels were associated with a lower risk of sarcopenic obesity and with better perceived health in the elderly. f) Elderly people with sarcopenic obesity have lower functional capacity than healthy elderly people. g) High strength levels were associated with better cognitive status, cognitive status being the variable that most influences strength deterioration, even more than sex and age.
Abstract:
Due to the intensive use of mobile phones for different purposes, these devices usually contain confidential information which must not be accessed by anyone other than the owner of the device. Furthermore, new-generation phones commonly incorporate an accelerometer which may be used to capture the acceleration signals produced as a result of the owner's gait. Nowadays, gait identification on the basis of acceleration signals is being considered as a new biometric technique which allows blocking the device when another person is carrying it. Although distance-based approaches such as Euclidean distance or dynamic time warping have been applied to solve this identification problem, they show difficulties when dealing with gaits at different speeds. For this reason, in this paper, a method to extract an average template from instances of the gait at different velocities is presented. This method has been tested with the gait signals of 34 subjects walking at different motion speeds (slow, normal and fast) and it has been shown to improve the performance of Euclidean distance and classical dynamic time warping.
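A minimal dynamic time warping (DTW) distance, the building block the averaging method works on top of; this is the textbook formulation, not the paper's template-averaging algorithm.

```python
# Textbook DTW distance between two 1-D signals of possibly different lengths.
import numpy as np

def dtw_distance(a, b):
    """Returns the DTW alignment cost between signals a and b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

slow = np.sin(np.linspace(0, 4 * np.pi, 120))   # same gait, stretched in time
fast = np.sin(np.linspace(0, 4 * np.pi, 80))
print(dtw_distance(slow, fast))                 # small despite length mismatch
```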
Abstract:
Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MCMC scheme for generating samples from one-dimensional target distributions. ARMS is widely used within Gibbs sampling, where automatic and fast samplers are often needed to draw from univariate full-conditional densities. In this work, we propose an alternative adaptive algorithm (IA2RMS) that overcomes the main drawback of ARMS (an incomplete adaptation of the proposal in some cases), speeding up the convergence of the chain to the target. Numerical results show that IA2RMS outperforms the standard ARMS, providing a correlation among samples close to zero.
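For context, a plain random-walk Metropolis-Hastings sampler for a univariate target (a hedged sketch; ARMS and IA2RMS replace the fixed proposal below with an adaptively constructed one, which is exactly where their advantage lies).

```python
# Plain Metropolis-Hastings, NOT ARMS/IA2RMS themselves.
import numpy as np

rng = np.random.default_rng(0)

def mh_chain(log_target, x0, n_steps=5000, step=1.0):
    """Random-walk Metropolis for a one-dimensional target density."""
    x, out = x0, []
    lp = log_target(x)
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()   # fixed proposal: the part
        lp_prop = log_target(prop)                # adaptive schemes improve
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        out.append(x)
    return np.array(out)

samples = mh_chain(lambda x: -0.5 * x ** 2, x0=0.0)  # standard normal target
print(samples.mean(), samples.std())
```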
Abstract:
In recent years, Independent Component Analysis (ICA) has proven itself to be a powerful signal-processing technique for solving Blind Source Separation (BSS) problems in different scientific domains. In the present work, an application of ICA for processing NIR hyperspectral images to detect traces of peanut in wheat flour is presented. Processing was performed without a priori knowledge of the chemical composition of the two food materials. The aim was to extract the source signals of the different chemical components from the initial data set and to use them in order to determine the distribution of peanut traces in the hyperspectral images. To determine the optimal number of independent components to be extracted, the Random ICA by blocks method was used. This method is based on the repeated calculation of several models using an increasing number of independent components after randomly segmenting the matrix data into two blocks, and then calculating the correlations between the signals extracted from the two blocks. The extracted ICA signals were interpreted and their ability to classify peanut and wheat flour was studied. Finally, all the extracted ICs were used to construct a single synthetic signal that could be used directly with the hyperspectral images to enhance the contrast between the peanut and the wheat flours in a real multi-use industrial environment. Furthermore, feature extraction methods (a connected-components labelling algorithm followed by a flood-fill method to extract object contours) were applied in order to target the spatial location of the peanut traces. A good visualization of the distributions of peanut traces was thus obtained.
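A hedged sketch of the "Random ICA by blocks" selection idea as described above, assuming the hyperspectral cube has been unfolded to a (pixels × wavelengths) matrix; the function names and reproducibility criterion are illustrative, and sklearn's FastICA stands in for the authors' ICA implementation.

```python
# Hedged sketch: pick the number of ICs by cross-block reproducibility.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

def block_correlation(X, k):
    """X: (n_pixels, n_wavelengths) unfolded hyperspectral image."""
    idx = rng.permutation(len(X))
    half = len(X) // 2
    A, B = X[idx[:half]], X[idx[half:2 * half]]   # two random pixel blocks
    # Extract k spectral signatures (unmixing rows) from each block.
    Sa = FastICA(n_components=k, random_state=0).fit(A).components_
    Sb = FastICA(n_components=k, random_state=0).fit(B).components_
    # Match each component in A to its best-correlated partner in B.
    C = np.abs(np.corrcoef(Sa, Sb)[:k, k:])
    return C.max(axis=1).min()    # worst best-match correlation across blocks

# Usage idea: increase k while block_correlation(X, k) stays above a chosen
# threshold; components that stop reproducing across blocks are noise-like.
```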