985 results for multiplier of convolution
Abstract:
Postgraduate Program in Collective Health (Pós-graduação em Saúde Coletiva) - FMB
Abstract:
This study arose from the desire to understand how intensive care unit (ICU) nurses perceive the caregiving process. ICU nurses must be able to promote effective changes in the care provided, attend to adversities, and act promptly to meet multiple demands. Aim: to understand what the caregiving process in the ICU means to nurses. Methodology: a qualitative study with a phenomenological approach comprising three moments: description, reduction and comprehension. After approval by the Research Ethics Committee (211/08) on 02/06/2008, individual interviews were conducted using the following guiding questions: What is the working process for ICU nurses? What does being an ICU nurse mean to you? The subjects were twelve nurses working in ICUs. Results: the analysis revealed the themes of nursing process, relationship with the ICU patient and family, and humanization. Conclusion: the nurses in the study report both difficulties and satisfaction related to the caregiving process, especially in dealing with the anxieties of patients and families, revealing difficulties in processing feelings. The nurse is recognized by the team as a leading agent and a multiplier of caregiving actions.
Abstract:
This research examines the capacity of urban planning to accommodate emerging systems of urban management. The aim of this thesis is to describe these informal procedures through an exhaustive review of diverse experiences that alter the customary mode of urban management, leading to a systematization of these processes that lays the groundwork for proposing new forms of urban management. The goal, in other words, is not so much to classify the most relevant urban interventions as to uncover their structure, an order for disorder, a taxonomy of these actions that makes it possible to reformulate current forms of public management. The attempt has been to draw a map of possible opportunities around the management of urban space. Through the processes studied, this research analyzes the successes and failures of the last five years of alternative management of vacant spaces, seeking to extract formulas that could be incorporated into the current way of making the city. The study addresses a partial but fundamental aspect of urbanism, giving priority to the public realm without denying the other forms of urban space: it sets aside the urban whole in order to decipher this fragment. The main interest of this thesis therefore lies in the production of public space. Since this field is not exclusive to urbanism, other activities such as art, philosophy, anthropology and sociology have contributed fundamental aspects to this research. The result is a many-sided, interdisciplinary approach that seeks to establish rigorously the proven possibilities for improving urban space offered by emerging processes of informal public management.
Using a methodology created ex profeso (a structure defined by three strategic lines, Governance, Complexity and Social Cohesion, each broken down into six key concepts), this research analyzes the aspects of emerging systems related to the informal production of public space. Under Governance, part of this thesis studies informality, transversality, appropriation, participation, transparency and networks. Complexity is studied in relation to urban diversity, sustainable use of public space, reactivation, production of public space, urban ecology and creative processes. For Social Cohesion, the research examines innovation in facilities, bottom-up processes, the rhizome, identity, liquid systems and empowerment. This analytical machine is tested on three urban supports: the urban scene, container buildings and urban voids. Five urban situations in the city of Madrid were studied, selected as the most relevant urban practices of the last five years. Although the analysis arises from these five cases, the model could be useful for other research interested in the informal aspects of urban management. This research sets out to reclaim other, more innovative dimensions which, while escaping the urbanist's habitual ways of working, reveal themselves intensely in current ways of claiming and producing urban space.
The aim has been to reveal urban space as a situational container of public meaning, a potential multiplier of functional effects such as management, temporality, participation, rehabilitation, multiplicity of uses and the correction of compactness; of perceptual and aesthetic effects on the urban scene; of political, training and empowerment effects; and of social effects such as the generation of new networks and the creation of identities. The research is intended to yield conclusions applicable to the management and planning of public space in cities and urban environments immersed in similar processes.
Abstract:
The calculation of the dose is one of the key steps in radiotherapy planning [1-5]. This calculation should be as accurate as possible, and over the years it has become feasible through the implementation of new dose calculation algorithms in the treatment planning systems used in radiotherapy. When a breast tumour is irradiated, a precise dose distribution is fundamental to ensure planning target volume (PTV) coverage and prevent skin complications. Some investigations using breast cases showed that the pencil beam convolution (PBC) algorithm overestimates the dose in the PTV and in the proximal region of the ipsilateral lung, but underestimates the dose in the distal region of the ipsilateral lung, when compared with the anisotropic analytical algorithm (AAA). With this study we aim to compare the performance of the PBC and AAA algorithms in breast tumours.
Abstract:
Multiplier analysis based upon the information contained in Leontief's inverse is undoubtedly part of the core of the input-output methodology, and numerous applications and extensions have been developed that exploit its informational content. Nonetheless, there are some implicit theoretical assumptions whose implications have perhaps not been fully assessed. This is the case of the 'excess capacity' assumption, under which resources are available as needed to adjust production to new equilibrium states. In real-world applications, however, new resources are scarce and costly. Supply constraints kick in, and resource allocation needs to take them into account to properly assess the effect of government policies. Using a closed general equilibrium model that incorporates supply constraints, we perform some simple numerical exercises and derive a 'constrained' multiplier matrix that can be compared with the standard 'unrestricted' multiplier matrix. Results show that the effectiveness of expenditure policies hinges critically on whether or not supply constraints are considered.
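As a point of reference for the comparison above, the standard 'unrestricted' multiplier matrix is simply the Leontief inverse. A minimal sketch with an invented three-sector coefficients matrix (all values illustrative, not from the paper):

```python
import numpy as np

# Invented 3-sector technical coefficients matrix A:
# A[i, j] = input from sector i required per unit of output of sector j.
A = np.array([
    [0.2, 0.3, 0.1],
    [0.1, 0.1, 0.3],
    [0.2, 0.1, 0.2],
])

# Standard 'unrestricted' multiplier matrix: the Leontief inverse (I - A)^-1.
M = np.linalg.inv(np.eye(3) - A)

# Under the 'excess capacity' assumption, output adjusts freely to a
# final-demand shock: delta_x = M @ delta_f.
delta_f = np.array([1.0, 0.0, 0.0])   # unit demand increase in sector 0
delta_x = M @ delta_f                 # total (direct + indirect) effects
```

A 'constrained' multiplier matrix would replace the free adjustment in the last step with a resource-constrained allocation, which is what the paper's general equilibrium exercises capture.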
Abstract:
We discuss necessary as well as sufficient conditions for the second iterated local multiplier algebra of a separable C*-algebra to agree with the first.
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
The aim was to make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head-scatter fluence (>40%), which is not adequately modeled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared to conventional techniques and may offer the most optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
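For homogeneous on/off sources the multinomial approach reduces to a binomial, which already shows the idea: the aggregate bandwidth distribution is evaluated directly instead of convolving per-source distributions. A sketch with invented parameter names and traffic values (not the paper's model):

```python
from math import comb

def bandwidth_pmf(n_sources, p_on, peak_rate):
    """Distribution of instantaneous aggregate bandwidth for n homogeneous
    on/off sources, each active with probability p_on (a binomial special
    case of the multinomial approach). Returns {bandwidth: probability}.
    Direct evaluation replaces repeated per-source convolutions."""
    return {
        k * peak_rate: comb(n_sources, k) * p_on**k * (1 - p_on)**(n_sources - k)
        for k in range(n_sources + 1)
    }

# CAC check: admit a new connection only if overload probability stays small.
pmf = bandwidth_pmf(10, 0.3, 2.0)            # 10 sources at 2 Mb/s peak
p_overload = sum(p for bw, p in pmf.items() if bw > 12.0)

# 'Deconvolving' a departing source is just re-evaluation with n - 1 sources.
pmf_after_leave = bandwidth_pmf(9, 0.3, 2.0)
```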
Abstract:
The study of the thermal behavior of complex packages such as multichip modules (MCMs) is usually carried out by measuring the so-called thermal impedance response, that is, the transient temperature after a power step. From the analysis of this signal, the thermal frequency response can be estimated and, consequently, compact thermal models may be extracted. We present a method to obtain an estimate of the time-constant distribution underlying the observed transient. The method is based on an iterative deconvolution that produces an approximation to the time-constant spectrum while preserving a convenient convolution form. The method is applied to the thermal response of a microstructure analyzed by the finite element method, as well as to the measured thermal response of a transistor array integrated circuit (IC) in an SMD package.
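The authors' iterative deconvolution algorithm is not reproduced here, but the underlying idea, recovering a discrete time-constant spectrum (R_i, τ_i) such that Z(t) = Σ R_i(1 − exp(−t/τ_i)) from a sampled transient, can be sketched with a non-negative least-squares fit standing in for their method (all values invented):

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic thermal step response Z(t) = sum_i R_i * (1 - exp(-t / tau_i)),
# built from a known two-line time-constant spectrum (invented values).
t = np.logspace(-6, 1, 200)                    # time samples, s
tau_true = {1e-4: 2.0, 1e-2: 5.0}              # tau_i (s) -> R_i (K/W)
z = sum(R * (1 - np.exp(-t / tau)) for tau, R in tau_true.items())

# Recover a discrete spectrum on a log-spaced tau grid. Non-negative
# least squares is a stand-in for the paper's iterative deconvolution;
# both preserve the convenient convolution form of Z(t).
tau_grid = np.logspace(-6, 0, 60)
basis = 1 - np.exp(-t[:, None] / tau_grid[None, :])
spectrum, residual = nnls(basis, z)
```

The total recovered thermal resistance (the sum of the spectrum lines) should match the steady-state plateau of the transient.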
Abstract:
Dose kernel convolution (DK) methods have been proposed to speed up absorbed dose calculations in molecular radionuclide therapy. Our aim was to evaluate the impact of tissue density heterogeneities (TDH) on dosimetry when using a DK method and to propose a simple density-correction method. METHODS: This study was conducted on 3 clinical cases: case 1, non-Hodgkin lymphoma treated with (131)I-tositumomab; case 2, a neuroendocrine tumor treatment simulated with (177)Lu-peptides; and case 3, hepatocellular carcinoma treated with (90)Y-microspheres. Absorbed dose calculations were performed using a direct Monte Carlo approach accounting for TDH (3D-RD) and a DK approach (VoxelDose, or VD). For each individual voxel, the VD absorbed dose, D(VD), calculated assuming uniform density, was corrected for density, giving D(VDd). The average 3D-RD absorbed dose values, D(3DRD), were compared with D(VD) and D(VDd) using the relative difference Δ(VD/3DRD). At the voxel level, density-binned Δ(VD/3DRD) and Δ(VDd/3DRD) were plotted against ρ and fitted with a linear regression. RESULTS: The D(VD) calculations showed good agreement with D(3DRD). Δ(VD/3DRD) was less than 3.5%, except for the tumor of case 1 (5.9%) and the renal cortex of case 2 (5.6%). At the voxel level, the Δ(VD/3DRD) range was 0%-14% for cases 1 and 2, and -3% to 7% for case 3. All 3 cases showed a linear relationship between voxel bin-averaged Δ(VD/3DRD) and density ρ: case 1 (Δ = -0.56ρ + 0.62, R(2) = 0.93), case 2 (Δ = -0.91ρ + 0.96, R(2) = 0.99), and case 3 (Δ = -0.69ρ + 0.72, R(2) = 0.91). The density correction improved the agreement of the DK method with the Monte Carlo approach (Δ(VDd/3DRD) < 1.1%), though to a lesser extent for the tumor of case 1 (3.1%). At the voxel level, the Δ(VDd/3DRD) range decreased for the 3 clinical cases (case 1, -1% to 4%; case 2, -0.5% to 1.5%; case 3, -1.5% to 2%). The linear relationship with density no longer held for cases 2 and 3; it remained for case 1 (Δ = 0.41ρ - 0.38, R(2) = 0.88), although the slope was less pronounced. CONCLUSION: This study shows a small influence of TDH in the abdominal region for 3 representative clinical cases. A simple density-correction method was proposed and improved the agreement in the absorbed dose calculations when using our voxel S value implementation.
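The exact density-correction formula is not given above, so the sketch below uses a simple mass-scaling assumption (dose divided by local relative density) purely to illustrate the voxel-wise correction and the linear fitting step; all data are invented:

```python
import numpy as np

def density_correct(dose_vd, rho, rho_ref=1.0):
    """Correct a dose-kernel voxel dose, computed assuming uniform (water)
    density, for the local density rho. The rho_ref / rho mass scaling used
    here is an illustrative assumption, not the paper's formula."""
    return np.asarray(dose_vd) * rho_ref / np.asarray(rho)

# Invented voxel data: DK doses (Gy) and local densities (g/cm^3).
dose_vd = np.array([1.00, 0.95, 1.10, 0.40])
rho = np.array([1.00, 1.05, 0.95, 0.30])   # 0.30 ~ lung tissue

dose_vdd = density_correct(dose_vd, rho)

# Voxel-wise relative difference between uncorrected and corrected dose,
# fitted against density as in the reported regressions (there the fit is
# of Delta(VD/3DRD) against rho, e.g. case 1: Delta = -0.56*rho + 0.62).
delta = (dose_vd - dose_vdd) / dose_vdd
slope, intercept = np.polyfit(rho, delta, 1)
```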
Abstract:
Anthropomorphic model observers are mathematical algorithms applied to images with the ultimate goal of predicting human signal detection and classification accuracy across varieties of backgrounds, image acquisitions and display conditions. A limitation of current channelized model observers is their inability to handle irregularly shaped signals, which are common in clinical images, without a high number of directional channels. Here, we derive a new linear model observer based on convolution channels which we refer to as the "Filtered Channel observer" (FCO), as an extension of the channelized Hotelling observer (CHO) and the nonprewhitening with an eye filter (NPWE) observer. In analogy to the CHO, this linear model observer can take the form of a single template with an external noise term. To compare with human observers, we tested signals with irregular and asymmetrical shapes, spanning sizes from those of lesions down to those of microcalcifications, in 4-AFC breast tomosynthesis detection tasks, with three different contrasts for each case. Whereas humans uniformly outperformed conventional CHOs, the FCO outperformed humans for every signal with only one exception. Additive internal noise in the models allowed us to degrade model performance and match human performance. We could not match all the human performances with a single internal-noise component for all signal shape, size and contrast conditions. This suggests either that the internal noise varies across signals or that the model cannot entirely capture the human detection strategy. However, the FCO model offers an efficient way to approximate human observer performance for non-symmetric signals.
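The FCO template itself is not specified above, but the Hotelling step common to the CHO and FCO, plus the additive internal noise used to degrade performance toward human levels, can be sketched on invented channelized data (real use would first apply channel filters to images):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble of channelized data (n samples x c channels); values invented.
n, c = 500, 6
absent = rng.normal(0.0, 1.0, (n, c))       # signal-absent channel outputs
present = rng.normal(0.3, 1.0, (n, c))      # signal-present: small mean shift

# Hotelling template: inverse intra-class scatter times the mean difference.
S = 0.5 * (np.cov(absent.T) + np.cov(present.T))
w = np.linalg.solve(S, present.mean(0) - absent.mean(0))

# Decision statistics; additive internal noise degrades model performance,
# as described above for matching human observers.
lam_a = absent @ w
lam_p = present @ w
internal = rng.normal(0.0, 0.5 * lam_a.std(), n)
pc = np.mean(lam_p + internal > lam_a)      # rough 2-AFC proportion correct
```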
Abstract:
Decimal multiplication is an integral part of financial, commercial, and internet-based computations. This paper presents a novel double digit decimal multiplication (DDDM) technique that performs two digit multiplications simultaneously in one clock cycle. This design offers low latency and high throughput. When multiplying two n-digit operands to produce a 2n-digit product, the design has a latency of (n/2) + 1 cycles. The paper presents area and delay comparisons for 7-digit, 16-digit and 34-digit double digit decimal multipliers on different families of Xilinx, Altera, Actel and QuickLogic FPGAs. The multipliers presented can be extended to support decimal floating-point multiplication for the IEEE P754 standard.
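The hardware design is not reproduced here, but a behavioural sketch of the digit-pair idea (two decimal digits of the multiplier consumed per iteration, corresponding to one clock cycle) looks like:

```python
def dddm_multiply(a, b):
    """Behavioural sketch of double digit decimal multiplication: consume
    two decimal digits of the multiplier b per iteration, accumulating
    partial products. Returns the product and the iteration count, which
    mirrors the roughly n/2 cycles for an n-digit multiplier."""
    product, shift, cycles = 0, 0, 0
    while b > 0:
        pair = b % 100                 # two decimal digits at once
        product += pair * a * 10**shift
        b //= 100
        shift += 2
        cycles += 1
    return product, cycles

p, c = dddm_multiply(1234567, 7654321)    # two 7-digit operands
# c is 4: ceil(7 / 2) digit pairs of the multiplier are consumed
```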