113 results for Blurring
Abstract:
This project is proposed as the initial stage of a research programme on the relation between nature, culture and technique. Nature and culture have traditionally been considered distinct and opposed domains, and it is within this distinction that technique acquires a central place. Western thought on technique has produced diverse interpretations: from its subordination to true knowledge (episteme) in classical philosophy, through the optimism of the Renaissance and the Enlightenment, which saw in technique the possibility of dominating nature, to Romantic ambiguity and unease (Mitcham, 1979). During the twentieth century two antagonistic positions on technique can be distinguished. On one hand, a "critical" attitude identifiable in the work of philosophers from different traditions such as Ortega y Gasset (1939), Heidegger (1954), Mumford (1971), Ellul (1960) and the Frankfurt School. On the other hand, an "engineering" philosophy of technique that analyses technology as a paradigm of human thought and action. Eco has interpreted this dichotomy as that of the "apocalyptic and the integrated". Beyond their differences, what both positions have in common is that they start from a dichotomy between culture and nature. Our perspective rejects this dichotomy; on the contrary, we point to a growing intertwinement between the two, in which the borders between one and the other become blurred. The notion of the "technical object" proposed by Simondon (2007) refers to the insertion of the technical object into culture, where the "human reality" present in it must be recognised. This does not mean "humanising the technical object", but rather inquiring into the place it occupies in culture and establishing its relation to nature. In the seventeenth century man himself was reinterpreted as a machine (La Mettrie, 2000). Two tendencies can currently be identified in conceptions of technique: "human-machines" and "machine-humans", in other words, the disposition of the human towards the machine and the tendency of the machine towards the human. Nevertheless, both positions maintain a taxonomic distinction between the body, or the organic, and the machinic, which implies treating this relation as extrinsic. Against this tension Haraway proposes the concept of the cyborg: "a cybernetic organism" (1995). Technological developments have so modified the life of organic beings that their bodies can no longer be conceived independently of technology. This leads us to rethink the distinction between "animals/men/machines", understood as expressions of nature, culture and technology respectively. Our research starts from the hypothesis that technique dissolves differences of a natural and cultural order through technical objects, which are themselves cultural products. Aesthetics, concerned as it is with the sensory perception of the world, cannot evade its technical dimension. Leaving aside the critique of the "culture industry", we consider Benjamin's approach to the problem of technique relevant because it addresses the aforementioned intertwinement in the field of perception. According to Benjamin (1967), the irruption of technique makes possible an aestheticisation of politics that converges, at its extreme, in fascism, while also opening the possibility of dismantling the ideology of infinite progress.
Seeking an integration of aesthetic and political approaches to technique, Flusser (1983) proposes the "black box" as a metaphor for contemporary technique; his proposal of "opening the black box" consists in gaining knowledge of how the apparatus works. To articulate this relation we will work with unorthodox approaches to technique such as those of Benjamin (1967), Flusser (1983) and others. Our research thus addresses technique from a combined philosophical, aesthetic and political standpoint, redefining it on the basis of the intertwinement between culture and nature, and raising the question of the blurring line between nature, culture and technique and its effects in philosophy, politics and aesthetics.
Abstract:
Contemporary international migration is embedded in a process of global interconnectedness defined by the revolutions in transport and in information and communication technologies. One consequence of this global interconnectedness is that migrants have a greater capacity to process information both before and after leaving. These changes could have unexpected implications for contemporary migration with regard to migrants' capacity to make better-informed decisions, the reduction of uncertainty in migratory contexts, the blurring of the concept of distance, and the decision to migrate to more distant places. This research matters because a lack of knowledge about this question could widen the gap between the objectives of migration policies and their outcomes. The role played by information agents in migratory contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account migrants' greater capacity to process information and the information sources they trust. This article shows that the equation "more information equals better informed" does not always hold. Even in the information age, unreliable sources, false expectations, information overload and rumours are still present in migratory contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be comprehensive, relevant, trusted and up to date.
Abstract:
Overdiagnosis is the diagnosis of an abnormality that is not associated with a substantial health hazard and that patients gain no benefit from knowing about. It is neither a misdiagnosis (diagnostic error) nor a false positive result (a positive test in the absence of a real abnormality). It mainly results from screening, the use of increasingly sensitive diagnostic tests, incidental findings on routine examinations, and the widening of diagnostic criteria that define a condition requiring an intervention. The blurring boundary between risk and disease, physicians' fear of missing a diagnosis, and patients' need for reassurance are further causes of overdiagnosis. Overdiagnosis often entails procedures to confirm or exclude the presence of the condition and is by definition associated with useless treatments and interventions, generating harm and costs without any benefit. Overdiagnosis also diverts healthcare professionals from caring about other health issues. Preventing overdiagnosis requires raising awareness of its occurrence among healthcare professionals and patients, avoiding unnecessary and untargeted diagnostic tests, and avoiding screening without demonstrated benefit. Furthermore, systematically accounting for the harms and benefits of screening and diagnostic tests, and setting risk-factor thresholds based on the expected absolute risk reduction, would also help prevent overdiagnosis.
Abstract:
Identification is ever more important in the online world, and identity-related crime is a growing problem related to this. This new category of crime is not restricted to high-profile instances of identity 'theft' or identity fraud; it is complex and wide-ranging, from identity deletion to unlawful identity creation and identity 'theft'. Commonly accepted definitions are lacking, which blurs the available statistics, and policies to combat this new crime are piecemeal at best. To assess the real nature and magnitude of identity-related crime, and to be able to discuss how it can be combated, identity-related crime should be understood in all its aspects. As a first key step, this article introduces a typology of identity-related crime, consisting of conceptual, technical and legal categories, that can be used as a comprehensive framework for future research, countermeasures and policies related to identity-related crime.
Abstract:
MRI has evolved into an important diagnostic technique in medical imaging. However, the reliability of the derived diagnosis can be degraded by artifacts, which challenge both radiologists and automatic computer-aided diagnosis. This work proposes a fully automatic method for measuring the image quality of three-dimensional (3D) structural MRI. Quality measures are derived by analyzing the air background of magnitude images and are capable of detecting image degradation from several sources, including bulk motion, residual magnetization from incomplete spoiling, blurring, and ghosting. The method has been validated on 749 3D T1-weighted 1.5T and 3T head scans acquired at 36 Alzheimer's Disease Neuroimaging Initiative (ADNI) study sites operating with various software and hardware combinations. Results are compared against qualitative grades assigned by the ADNI quality control center (taken as the reference standard). The derived quality indices are independent of the MRI system used and agree with the reference standard quality ratings with high sensitivity and specificity (>85%). The proposed procedures for quality assessment could be of great value for both research and routine clinical imaging, and could greatly improve workflow by ruling out the need for a repeat scan while the patient is still in the magnet bore.
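Since the abstract only outlines the approach, the following is a minimal sketch (not the authors' validated ADNI pipeline) of what analyzing the air background of a magnitude volume can look like; the background-selection quantile, the Rayleigh-ratio test and the flagging threshold are all assumptions made for illustration.

```python
# A crude background-based quality index for a 3D magnitude MRI volume.
# In a pure-noise air background the magnitude signal is Rayleigh-distributed;
# ghosting or motion residue adds structure that inflates the spread of the
# background relative to the Rayleigh expectation.
import numpy as np

def background_quality_index(volume, bg_fraction=0.1):
    """Return an artifact score from the air background of `volume`."""
    flat = volume.ravel()
    # Take the dimmest voxels as a proxy for the air background.
    threshold = np.quantile(flat, bg_fraction)
    background = flat[flat <= threshold]
    mean, std = background.mean(), background.std()
    # For Rayleigh noise, std/mean is a known constant:
    # sqrt((4 - pi) / pi) ~= 0.5227. Structured artifacts push it higher.
    rayleigh_ratio = np.sqrt((4 - np.pi) / np.pi)
    return (std / max(mean, 1e-9)) / rayleigh_ratio  # ~1.0 for a clean background

# Usage: score = background_quality_index(np.load("t1_volume.npy"))
# Scores well above 1 would flag the scan for visual review.
```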
Abstract:
Image filtering is a highly demanded approach to image enhancement in digital imaging systems design. It is widely used in television and camera design to improve the quality of the output image and to avoid problems such as image blurring, which gains importance in the design of large displays and of digital cameras. This thesis proposes a new image filtering method based on visual characteristics of the human eye, such as the modulation transfer function (MTF). In contrast to traditional filtering methods based on human visual characteristics, this thesis takes into account the anisotropy of human vision. The proposed method is based on laboratory measurements of the human eye MTF and takes into account the degradation of the image by the latter. The method enhances an image to pre-compensate for the degradation it will undergo through the human eye MTF, so as to give the perception of the original image quality. The thesis gives a basic understanding of the image filtering approach and of the concept of the MTF, and describes an algorithm to perform image enhancement based on the MTF of the human eye. Experiments have shown quite good results in human evaluations. Suggestions for future improvements of the algorithm are also given.
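As a rough illustration of the pre-compensation idea, the sketch below inverts an assumed anisotropic Gaussian MTF in the frequency domain. The thesis itself relies on laboratory MTF measurements, so the Gaussian model, its sigma parameters and the regularisation constant here are stand-ins.

```python
# Pre-compensate an image for an assumed anisotropic eye MTF by amplifying
# the frequencies the MTF would attenuate (regularised inverse filtering).
import numpy as np

def precompensate(image, sigma_x=1.5, sigma_y=0.8, eps=0.05):
    """Boost frequencies that an assumed eye MTF would attenuate."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]   # cycles/pixel along y
    fx = np.fft.fftfreq(w)[None, :]   # cycles/pixel along x
    # Anisotropic Gaussian MTF: different falloff along x and y.
    mtf = np.exp(-2 * (np.pi ** 2) * ((sigma_x * fx) ** 2 + (sigma_y * fy) ** 2))
    # Regularised inverse filter; eps avoids blow-up where the MTF is near zero.
    spectrum = np.fft.fft2(image) * (mtf / (mtf ** 2 + eps ** 2))
    return np.real(np.fft.ifft2(spectrum))
```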
Abstract:
The present article contributes to the ongoing academic debate on migrants' appropriation of artistic and political spaces in Germany. Cologne, one of the largest cities in Germany, is an interesting example of the tension between political discourse centred around multiculturalism and cultural segregation processes. The 'no fool is illegal' carnival organised by asylum seekers shows their capacity to act, as they reinvent an old local tradition by reinterpreting medieval rituals. Today, different groups and associations appropriate this festive art space: migrants, gays and lesbians, feminists and far-left groups either organise their own parties or take part in the official parties and parades as separate groups. As a result, the celebration of diversity figures on the local political agenda and becomes part of the official carnival festivities. This leads to a blurring of boundaries, whereby mainstream popular culture becomes more and more influenced by multicultural elements.
Abstract:
Objective: To evaluate the sonographic measurement of subcutaneous and visceral fat in correlation with the grade of hepatic steatosis. Materials and Methods: In the period from October 2012 to January 2013, 365 patients were evaluated. The subcutaneous and visceral fat thicknesses were measured with a convex, 3–4 MHz transducer placed transversely 1 cm above the umbilicus. The distance between the internal aspect of the abdominal rectus muscle and the posterior aortic wall in the abdominal midline was taken as the measurement of visceral fat. Increased liver echogenicity, blurring of vascular margins and increased acoustic attenuation were the parameters considered in the quantification of hepatic steatosis. Results: Steatosis was found in 38% of the study sample. In the detection of moderate to severe steatosis, the area under the ROC curve was 0.96 for women and 0.99 for men, indicating cut-off values for visceral fat thickness of 9 cm and 10 cm, respectively. Conclusion: The present study evidenced the correlation between steatosis and visceral fat thickness, and suggested visceral fat thickness values that allow normality to be differentiated from risk for steatohepatitis.
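The reported cut-offs translate directly into a screening rule. A minimal sketch follows; the function name and the choice to count values exactly at the cut-off as positive are assumptions, not part of the study.

```python
def visceral_fat_flag(thickness_cm: float, sex: str) -> bool:
    """Flag risk of steatohepatitis from sonographic visceral fat thickness.

    Cut-offs follow the study above: 9 cm for women, 10 cm for men.
    Treating values at the cut-off as positive is an assumption here.
    """
    cutoff_cm = 9.0 if sex.lower() == "female" else 10.0
    return thickness_cm >= cutoff_cm

# Example: visceral_fat_flag(9.5, "female") -> True
```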
Abstract:
In the network era, creative achievements such as innovations are more and more often created in interaction among different actors. The complexity of today's problems transcends the individual human mind, requiring not only individual but also collective creativity. In collective creativity, it is impossible to trace the source of new ideas to an individual. Instead, creative activity emerges from the collaboration and contribution of many individuals, thereby blurring the contribution of specific individuals in creating ideas. Collective creativity is often associated with diversity of knowledge, skills, experiences and perspectives. Collaboration between diverse actors thus triggers creativity and opens possibilities for collective creativity. This dissertation investigates collective creativity in the context of practice-based innovation. Practice-based innovation processes are triggered by problem setting in a practical context and conducted in non-linear processes utilising scientific and practical knowledge production and creation in cross-disciplinary innovation networks. In these networks, diversity, or the distances between innovation actors, is essential. Innovation potential may be found in exploiting different kinds of distances. This dissertation presents different kinds of distances, such as cognitive, functional and organisational, which could be considered sources of creativity and thus of innovation. However, the formation and functioning of these kinds of innovation networks can be problematic. Distances between innovating actors may be so great that a special interpretation function is needed, that is, brokerage. This dissertation defines factors that enhance collective creativity in practice-based innovation, especially in the fuzzy front-end phase of innovation processes. The first objective of this dissertation is to study individual and collective creativity at the employee level and to identify the factors that support individual and collective creativity in the organisation. The second objective is to study how organisations use external knowledge to support collective creativity in their innovation processes in open multi-actor innovation. The third objective is to define how brokerage functions create possibilities for collective creativity, especially in the context of practice-based innovation. The research objectives have been studied through five substudies using a case-study strategy. Each substudy highlights various aspects of creativity and collective creativity. The empirical data consist of materials from innovation projects arranged in the Lahti region, Finland, or from the development of innovation methods in the region. The Lahti region was chosen as the research context because the innovation policy of the region especially emphasises the promotion of practice-based innovations. The results of this dissertation indicate that not all the possibilities of collective creativity are utilised in the internal operations of organisations. The dissertation introduces several factors that could support collective creativity in organisations. However, creativity as a social construct is understood and experienced differently in different organisations, and these differences should be taken into account when supporting creativity in organisations. The increasing complexity of most potential innovations requires collaborative creative efforts that often exceed the boundaries of the organisation and call for the involvement of external expertise.
In practice-based innovation, different distances are considered sources of creativity. This dissertation offers practical guidance on how different kinds of distances can be exploited deliberately. It especially underlines the importance of brokerage functions in open, practice-based innovation in creating possibilities for collective creativity. As a contribution of this dissertation, a model of brokerage functions in practice-based innovation is formulated. According to the model, the results and success of brokerage functions are based on the context of brokerage as well as on the roles, tasks, skills and capabilities of brokers. Brokerage functions in practice-based innovation can also be divided into social and cognitive brokerage.
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus provide companies with better readiness to face the challenges of cash management. Leveraging a process mining solution, this research examined the automated invoice handling process of four different organizations. The invoice data were collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included identifying the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and the previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices; note, however, that the manual work of turning a paper invoice into electronic format through scanning is ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps prolonged the process. Regarding invoices from maverick purchases, the evidence shows that these were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions on how to improve the process included increasing invoice matching, reducing manual steps and leveraging different value-added services such as invoice validation, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
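The core measurements described here (end-to-end cycle times, lead times between consecutive process steps, comparison by invoice type) can be sketched directly from an event log. The file name and column names below are hypothetical, not taken from the study.

```python
# Measuring invoice cycle times and step-to-step lead times from an event
# log with pandas, in the spirit of the process-mining analysis above.
import pandas as pd

log = pd.read_csv("invoice_events.csv", parse_dates=["timestamp"])
# Assumed columns: invoice_id, activity, timestamp, invoice_type
# (e.g. "e-invoice" vs. "scanned").

# End-to-end cycle time per invoice: first event to last event.
cycle = (log.sort_values("timestamp")
            .groupby("invoice_id")["timestamp"]
            .agg(start="min", end="max"))
cycle["days"] = (cycle["end"] - cycle["start"]).dt.total_seconds() / 86400

# Compare processing speed by invoice type (e-invoice vs. scanned).
by_type = log.drop_duplicates("invoice_id").set_index("invoice_id")["invoice_type"]
print(cycle.join(by_type).groupby("invoice_type")["days"].median())

# Transitions between consecutive steps with the longest median waits.
log = log.sort_values(["invoice_id", "timestamp"])
log["wait"] = log.groupby("invoice_id")["timestamp"].diff()
log["transition"] = (log.groupby("invoice_id")["activity"].shift()
                     + " -> " + log["activity"])
print(log.groupby("transition")["wait"].median().nlargest(5))
```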
Abstract:
Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique which provides information on the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of disease for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3D representations of the volumetric distribution of a target's radiotracers. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented which illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction to obtain near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma cameras can be of great help in obtaining relevant functional information in experiments using small animals.
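The decoding step for coded-mask collimators is classically a correlation of the recorded projection with a decoding array matched to the mask. A minimal sketch of that principle follows; the random placeholder mask, the balanced decoder and the noise model are all assumptions, since a real system would use a URA/MURA pattern together with its exact decoding pair.

```python
# Correlation-based decoding of a coded-aperture projection (illustrative).
import numpy as np
from scipy.signal import convolve2d, correlate2d

rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(11, 11)).astype(float)  # placeholder aperture
decoder = 2.0 * mask - 1.0                               # balanced decoding array

# Simulated detector image: the source shadowed through the mask, plus noise.
source = np.zeros((32, 32))
source[16, 16] = 100.0                                   # a point source
detector = convolve2d(source, mask, mode="same")
detector = detector + rng.poisson(1.0, size=detector.shape)

# Decoding: correlate the recorded projection with the decoding array;
# for a proper mask/decoder pair this approximates a delta function.
reconstruction = correlate2d(detector, decoder, mode="same")
```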
Abstract:
Confocal and two-photon microscopy have become essential tools in biological research, and today many investigations are not possible without their help. The valuable advantage that these two techniques offer is the ability of optical sectioning. Optical sectioning makes it possible to obtain 3D visualization of structures and, hence, valuable information about the structural relationships and the geometrical and morphological aspects of the specimen. The achievable lateral and axial resolutions of confocal and two-photon microscopy, as in other optical imaging systems, are both defined by diffraction. Any aberration or imperfection present during imaging results in a broadening of the theoretical resolution, blurring, and geometrical distortions in the acquired images that interfere with the analysis of the structures, and lowers the fluorescence collected from the specimen. The aberrations may have different causes and can be classified by their sources: specimen-induced aberrations, optics-induced aberrations, illumination aberrations, and misalignment aberrations. This thesis presents an investigation and study of image enhancement, approached in two different directions. Initially, we investigated the sources of the imperfections. We propose methods to eliminate or minimize aberrations introduced during image acquisition by optimizing the acquisition conditions. The impact on resolution of using a coverslip whose thickness is mismatched with the one the objective lens is designed for was shown, and a novel technique was introduced to set the proper value on the correction collar of the lens. The amount of spherical aberration with regard to the numerical aperture of the objective lens was investigated, and it was shown that, depending on the purpose of the imaging task, different numerical apertures must be used. The deformed beam cross-section of the single-photon excitation source was corrected, and the resulting enhancement of resolution and image quality was shown. Furthermore, the dependency of the scattered light on the excitation wavelength was shown empirically. In the second part, we continued the study of the image enhancement process with deconvolution techniques. Although deconvolution algorithms are widely used to improve the quality of images, how well a deconvolution algorithm performs depends highly on the point spread function (PSF) of the imaging system supplied to the algorithm and on the level of its accuracy. We investigated approaches for obtaining a more precise PSF. Novel methods to improve the pattern of the PSF and reduce the noise are proposed. Furthermore, multiple sources for extracting the PSFs of the imaging system are introduced, and the empirical deconvolution results obtained using each of these PSFs are compared. The results confirm that a greater improvement is attained by applying the in situ PSF during the deconvolution process.
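Richardson-Lucy iteration is one standard deconvolution algorithm of the kind discussed here, and it makes the central point concrete: the PSF passed in drives the quality of the result. A minimal 2D sketch follows; the iteration count and the stabilising epsilon are assumptions.

```python
# Richardson-Lucy deconvolution: iteratively estimate the underlying image
# given an observed (blurred) image and a known PSF.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=30, eps=1e-12):
    """Deconvolve `observed` with `psf` (psf assumed normalised to sum to 1)."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)            # data / current model
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```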
Abstract:
Nanoscience and nanotechnology (NST) belong to a technoscientific field, the nanoworld, whose foundation is a hybridisation, as much conceptual as practical, between the domain of nature and that of technique. In this thesis we examine the resulting destabilisation of the distinction between the natural and the artificial. A socio-historical review of the construction of the nature/artifice dualism characteristic of modern societies then helps us grasp the socio-cultural stakes involved in its being called into question. The deconstruction, through discourse analysis, of interviews conducted with the leading NST researchers in Québec empirically illustrates, while systematising it, the twofold process of the artificialisation of nature and the naturalisation of technique, identified theoretically as characteristic of the way the nanoworld challenges the distinction between nature and artifice. We suggest that the artificialisation of nature and the naturalisation of technique, far from being contradictory, are elements of a synergistic dynamic whose result is a de-ontologisation of nature as a category of thought and a disqualification of the world that distinguishes human activity.