16 results for DISCRETE-SCALE-INVARIANCE
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Introduction: The Motor Assessment Scale (MAS) has proved to be a valid and reliable instrument for assessing the clinical progress of individuals who have suffered a stroke. Objectives: To translate and adapt the MAS to the Portuguese context and to contribute to the validation of the Portuguese version by assessing its internal consistency. Methodology: After a process of translation, expert review, back-translation, and comparison with the original version, the Portuguese version of the MAS was obtained. A cross-sectional correlational study was conducted to assess internal consistency; the final sample included 30 subjects, 16 male and 14 female, aged between 42 and 85 years (mean 64±11.85 years), with hemiparesis or hemiplegia resulting from stroke, undergoing physiotherapy in one of 6 hospitals selected by convenience; the mean time since diagnosis was 306±1322.82 days and the mean time in physiotherapy was 47±57.57 days. Results: A mean total score of 24±14.51 points was obtained, with a Cronbach's alpha coefficient of 0.939 without excluding any item; inter-item correlations ranged from 0.395 to 0.916. Conclusions: Despite the small sample and its heterogeneity in characteristics and scale scores, the Portuguese version of the MAS showed strong internal consistency, with most items highly correlated with one another, which supports the adequacy of each item and indicates that, overall, the scale has a logical and well-structured design.
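For reference, the internal-consistency statistic reported above is Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of its computation, on synthetic data rather than the study's sample (the MAS has 8 items scored 0-6):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (subjects x items) score matrix."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative random data (NOT the study's sample): 30 subjects, 8 items, scores 0-6
rng = np.random.default_rng(0)
scores = rng.integers(0, 7, size=(30, 8)).astype(float)
print(round(cronbach_alpha(scores), 3))
```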
Abstract:
Recent literature has shown that many classical pricing models (Black-Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological, meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −infinity while their expected return tends to +infinity, i.e., (risk = −infinity, return = +infinity). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to possible measurement errors is provided too. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and the good deal is then given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is of interest in itself, even when there are short-selling restrictions.
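For reference, a standard form of the two risk measures in question, with L the portfolio loss, alpha in (0, 1) the confidence level, and F_L the distribution function of L (the paper's own conventions may differ):

\[
\mathrm{VaR}_\alpha(L) = \inf\{\ell \in \mathbb{R} : F_L(\ell) \ge \alpha\},
\qquad
\mathrm{CVaR}_\alpha(L) = \min_{c \in \mathbb{R}}\Big\{ c + \tfrac{1}{1-\alpha}\,\mathbb{E}\big[(L-c)^{+}\big]\Big\}.
\]

In this notation a good deal is a sequence of portfolios \(x_n\) with \(\rho(x_n) \to -\infty\) and \(\mathbb{E}[x_n] \to +\infty\), where \(\rho\) is either of the two measures above.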
Abstract:
The importance of Social Responsibility (SR) increases when this business variable is related to other variables of a strategic nature in business activity (the competitive success the company achieves, the performance the firm develops, and the innovations it carries out). The hypothesis is that organizations that focus on SR are those that obtain higher outputs and innovate more, achieving greater competitive success. A scale for measuring orientation to SR was defined in order to determine the degree of relationship between the above elements. This instrument is original because no previous scales exist in the literature that measure, on the one hand, the three classic, theoretically accepted sub-constructs that make up SR and, on the other hand, the relationship between SR and the other variables. As a result of the causal relationship analysis we arrive at a scale of 21 indicators, validated with a sample of firms belonging to the Autonomous Community of Extremadura; to our knowledge, this is the first empirical validation of these dimensions in this context.
Abstract:
We consider a simple extension of the Standard Model by adding two Higgs triplets and a complex scalar singlet to its particle content. In this framework, the CP symmetry is spontaneously broken at high energies by the complex vacuum expectation value of the scalar singlet. Such a breaking leads to leptonic CP violation at low energies. The model also exhibits an A(4) × Z(4) flavor symmetry which, after being spontaneously broken at a high-energy scale, yields a tribimaximal pattern in the lepton sector. We consider small perturbations around the tribimaximal vacuum alignment condition in order to generate nonzero values of theta(13), as required by the latest neutrino oscillation data. It is shown that the value of theta(13) recently measured by the Daya Bay Reactor Neutrino Experiment can be accommodated in our framework together with large Dirac-type CP violation. We also address the viability of leptogenesis in our model through the out-of-equilibrium decays of the Higgs triplets. In particular, the CP asymmetries in the triplet decays into two leptons are computed and it is shown that the effective leptogenesis and low-energy CP-violating phases are directly linked.
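For context, the tribimaximal mixing pattern referred to above is, up to sign conventions, the matrix

\[
U_{\mathrm{TBM}} =
\begin{pmatrix}
\sqrt{2/3} & 1/\sqrt{3} & 0\\
-1/\sqrt{6} & 1/\sqrt{3} & -1/\sqrt{2}\\
-1/\sqrt{6} & 1/\sqrt{3} & 1/\sqrt{2}
\end{pmatrix},
\]

which predicts \(\sin^2\theta_{12} = 1/3\), \(\sin^2\theta_{23} = 1/2\) and \(\theta_{13} = 0\); this is why perturbations around the tribimaximal alignment are needed to accommodate the nonzero theta(13) measured by Daya Bay.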
Abstract:
We produce five flavour models for the lepton sector. All five models fit perfectly well - at the 1 sigma level - the existing data on the neutrino mass-squared differences and on the lepton mixing angles. The models are based on the type I seesaw mechanism, on a Z(2) symmetry for each lepton flavour, and either on a (spontaneously broken) symmetry under the interchange of two lepton flavours or on a (spontaneously broken) CP symmetry incorporating that interchange - or on both symmetries simultaneously. Each model makes definite predictions both for the scale of the neutrino masses and for the phase delta in lepton mixing; the fifth model also predicts a correlation between the lepton mixing angles theta(12) and theta(23).
Abstract:
Purpose - To develop and validate a psychometric scale for assessing image quality perception for chest X-ray images. Methods - Bandura's theory was used to guide scale development. A review of the literature was undertaken to identify items/factors which could be used to evaluate image quality using a perceptual approach. A draft scale was then created (22 items) and presented to a focus group (student and qualified radiographers). Within the focus group the draft scale was discussed and modified. A series of seven postero-anterior chest images were generated using a phantom with a range of image qualities. Image quality perception was confirmed for the seven images using signal-to-noise ratio (SNR 17.2–36.5). Participants (student and qualified radiographers and radiology trainees) were then invited to independently score each of the seven images using the draft image quality perception scale. Cronbach alpha was used to test internal reliability. Results - Fifty-three participants used the scale to grade image quality perception on each of the seven images. Aggregated mean scale score increased with increasing SNR from 42.1 to 87.7 (r = 0.98, P < 0.001). For each of the 22 individual scale items there was clear differentiation of low, mid and high quality images. A Cronbach alpha coefficient of >0.7 was obtained across each of the seven images. Conclusion - This study represents the first development of a chest image quality perception scale based on Bandura's theory. There was excellent correlation between the image quality perception scores derived using the scale and the SNR. Further research will involve a more detailed item and factor analysis.
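A minimal sketch of one common region-of-interest SNR measurement of the kind used to rank the phantom images (the paper's exact SNR definition is not given in the abstract and may differ; the ROIs and data below are purely illustrative):

```python
import numpy as np

def roi_snr(image: np.ndarray, signal_roi, background_roi) -> float:
    """Common ROI-based SNR: mean signal over background standard deviation."""
    sig = image[signal_roi]
    bg = image[background_roi]
    return sig.mean() / bg.std(ddof=1)

# Hypothetical synthetic image with a high-intensity region
rng = np.random.default_rng(1)
img = rng.normal(100.0, 5.0, size=(256, 256))
img[100:150, 100:150] += 120.0  # simulated high-attenuation structure
print(roi_snr(img,
              (slice(100, 150), slice(100, 150)),   # signal ROI
              (slice(0, 50), slice(0, 50))))        # background ROI
```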
Abstract:
Dissertation submitted to obtain the Master's degree in Chemical Engineering
Abstract:
Master's degree in Physiotherapy
Abstract:
We introduce the notions of equilibrium distribution and time of convergence in discrete non-autonomous graphs. Under some conditions we give an estimate of the convergence time to the equilibrium distribution using the second-largest eigenvalue of certain matrices associated with the system.
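A minimal sketch of this kind of estimate in the simpler autonomous case (the paper treats non-autonomous systems, where the matrices vary with time; the chain, tolerance, and decay model below are our illustration): the distance to equilibrium decays roughly like the second-largest eigenvalue modulus raised to the power t.

```python
import numpy as np

def convergence_time(P: np.ndarray, eps: float = 1e-3) -> float:
    """Rough convergence-time estimate for a row-stochastic matrix P:
    distance to equilibrium decays like lambda_2**t, so
    t ~ log(eps) / log(lambda_2), with lambda_2 the second-largest
    eigenvalue modulus."""
    moduli = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
    lam2 = moduli[1]
    return np.log(eps) / np.log(lam2)

# Hypothetical 3-state chain
P = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7]])
print(convergence_time(P))
```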
Abstract:
Purpose - To develop and validate a psychometric scale for assessing image quality for chest radiographs.
Abstract:
This work describes the use of Pulsed Electric Fields (PEF) to control protozoan contamination of a microalgae culture in an industrial 2.7 m³ microalgae photobioreactor. The contaminated culture was treated with PEF for 6 h at an average of 900 V/cm, with 65 μs pulses at 50 Hz. Working with recirculation, the entire culture was uniformly exposed to the PEF throughout the assay. The development of the microalgae and protozoan populations was followed, and the results showed that PEF is effective for the selective elimination of protozoa from microalgae cultures, inflicting growth halt, death, or cell rupture on the protozoa without affecting microalgae productivity. Specifically, the results show a reduction of the active protozoan population of 87% after the 6 h treatment and of 100% after a few days of normal cultivation regime. At the same time, the microalgae growth rate remained unaffected.
Abstract:
For an interval map, the poles of the Artin-Mazur zeta function provide topological invariants which are closely connected to topological entropy. It is known that for a time-periodic nonautonomous dynamical system F with period p, the p-th power [zeta_F(z)]^p of its zeta function is meromorphic in the unit disk. Unlike in the autonomous case, where the zeta function zeta_f(z) only has poles in the unit disk, in the p-periodic nonautonomous case [zeta_F(z)]^p may have zeros. In this paper we introduce the concept of spectral invariants of p-periodic nonautonomous discrete dynamical systems and study the role played by the zeros of [zeta_F(z)]^p in this context. As we will see, these zeros play an important role in the spectral classification of these systems.
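For context, the Artin-Mazur zeta function of a single map f (the autonomous case) is

\[
\zeta_f(z) = \exp\!\left(\sum_{n=1}^{\infty} \frac{N_n}{n}\, z^n\right),
\]

where \(N_n\) is the number of fixed points of the n-th iterate \(f^n\); the nonautonomous version is defined analogously, counting fixed points of the compositions of the periodic sequence of maps.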
Abstract:
Even though Software Transactional Memory (STM) is one of the most promising approaches to simplifying concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: if only a small fraction of the memory is under contention, lightweight barriers may be used for the rest. In this work, we propose a new solution based on adaptive object metadata (AOM) to promote the use of a fast path for accessing objects that are not under contention. We show that this approach makes the performance of an STM competitive with the best fine-grained lock-based approaches in some of the most challenging benchmarks.
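A deliberately simplified sketch of the adaptive-metadata idea, assuming details not given in the abstract (the class names are ours, not the paper's; a real STM also needs write sets, versioned locks, validation, and safe publication of the inflated state):

```python
import threading

class InflatedMetadata:
    """Full metadata, installed only after contention is observed on the object."""
    def __init__(self):
        self.lock = threading.Lock()
        self.version = 0

class TxObject:
    """Object with adaptive metadata: compact by default, inflated on conflict."""
    def __init__(self, value):
        self.value = value
        self.metadata = None        # None = no contention seen -> fast path

    def read(self, tx):
        if self.metadata is None:   # fast path: lightweight read barrier
            tx.read_set.append(self)
            return self.value
        with self.metadata.lock:    # slow path: full barrier via inflated metadata
            tx.read_set.append(self)
            return self.value

    def inflate(self):
        """Switch this object to the slow path once a conflict is detected."""
        if self.metadata is None:
            self.metadata = InflatedMetadata()

class Transaction:
    def __init__(self):
        self.read_set = []
```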
Abstract:
Feature discretization (FD) techniques often yield adequate and compact representations of the data, suitable for machine learning and pattern recognition problems. These representations usually decrease training time and yield higher classification accuracy, while allowing humans to better understand and visualize the data, as compared to the use of the original features. This paper proposes two new FD techniques. The first is based on the well-known Linde-Buzo-Gray quantization algorithm, coupled with a relevance criterion, and is able to perform unsupervised, supervised, or semi-supervised discretization. The second technique works in supervised mode and is based on the maximization of the mutual information between each discrete feature and the class label. Our experimental results on standard benchmark datasets show that these techniques scale up to high-dimensional data, attaining in many cases better accuracy than existing unsupervised and supervised FD approaches, while using fewer discretization intervals.
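A rough sketch of the second technique's underlying idea, choosing cut points that maximize the mutual information between the binned feature and the class label; this greedy procedure and its parameters are our illustration, not the paper's exact algorithm:

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mi_discretize(x, y, n_bins=4, n_candidates=32):
    """Greedily grow a set of cut points, each time keeping the candidate
    threshold that maximizes MI between the binned feature and the labels."""
    candidates = np.quantile(x, np.linspace(0.05, 0.95, n_candidates))
    cuts = []
    for _ in range(n_bins - 1):
        best = max(
            (c for c in candidates if c not in cuts),
            key=lambda c: mutual_info_score(y, np.digitize(x, sorted(cuts + [c]))),
        )
        cuts.append(best)
    return np.array(sorted(cuts))

# Hypothetical 1-D feature whose mean depends on the class
rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=500)
x = rng.normal(loc=y * 1.5, scale=1.0)
print(mi_discretize(x, y))
```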
Abstract:
In machine learning and pattern recognition tasks, the use of feature discretization techniques may have several advantages. The discretized features may hold enough information for the learning task at hand, while ignoring minor fluctuations that are irrelevant or harmful for that task. The discretized features have more compact representations that may yield both better accuracy and lower training time, as compared to the use of the original features. However, in many cases, mainly with medium and high-dimensional data, the large number of features usually implies that there is some redundancy among them. Thus, we may further apply feature selection (FS) techniques on the discrete data, keeping the most relevant features, while discarding the irrelevant and redundant ones. In this paper, we propose relevance and redundancy criteria for supervised feature selection techniques on discrete data. These criteria are applied to the bin-class histograms of the discrete features. The experimental results, on public benchmark data, show that the proposed criteria can achieve better accuracy than widely used relevance and redundancy criteria, such as mutual information and the Fisher ratio.
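A sketch in the spirit of relevance-redundancy selection on already-discretized features (a greedy mRMR-style filter; the paper's bin-class-histogram criteria differ in detail, and the penalty weight below is our assumption):

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def select_features(X_disc, y, k=10, redundancy_penalty=0.5):
    """Greedy selection on discrete features.
    Relevance: MI(feature, class). Redundancy: mean MI with chosen features."""
    n_features = X_disc.shape[1]
    relevance = [mutual_info_score(y, X_disc[:, j]) for j in range(n_features)]
    selected = []
    while len(selected) < k:
        def score(j):
            red = (np.mean([mutual_info_score(X_disc[:, j], X_disc[:, s])
                            for s in selected]) if selected else 0.0)
            return relevance[j] - redundancy_penalty * red
        best = max((j for j in range(n_features) if j not in selected), key=score)
        selected.append(best)
    return selected

# Hypothetical discrete data with one informative feature (column 0)
rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=300)
X = rng.integers(0, 5, size=(300, 20))
X[:, 0] = y * 2 + rng.integers(0, 2, size=300)
print(select_features(X, y, k=3))
```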