905 results for Polarization-entangled photons
Abstract:
This is a deliberately contentious paper about the future of the socio-political sphere in the West based on what we know about its past. I argue that the predominant public discourse in Western countries is best characterised as one of selective forgetfulness; a semi-blissful, amnesiac state of collective dementia that manifests itself in symbolic idealism: informationalism. Informationalism is merely the latest form of idealism. It is a lot like religion insofar as it causally relates abstract concepts with reality and, consequently, becomes confused between the two. Historically, this has proven to be a dangerous state of affairs, especially when elites become confused between ideas about how a society should work and the way it actually does work. Central to the idealism of the information age, at least in intellectual spheres, is the so-called "problem of the subject". I argue that the "problem of the subject" is a largely synthetic, destabilising, and ultimately fruitless theoretical abstraction which turns on a synthetically derived, generalised intradiscursive space; existentialist nihilism; and the theoretical baubles of ontological metaphysics. These philosophical aberrations are, in turn, historically concomitant with especially destructive political and social configurations. This paper sketches a theoretical framework for identity formation which rejects the problem of the subject, and proposes potential resources, sources, and strategies with which to engage the idealism that underpins this obfuscating problematic in an age of turbulent social uncertainty. Quite simply, I turn to history as the source of human identity. While informationalism, like religion, is mostly focused on utopian futures, I assert that history, not the future, holds the solutions for substantive problematics concerning individual and social identities.
I argue here that history, language, thought, and identity are indissolubly entangled and so should be understood as such: they are the fundamental parts of 'identities in action'. From this perspective, the ‘problem of the subject’ becomes less a substantive intellectual problematic and more a theoretical red herring.
Abstract:
Poly(olefin sulfone)s, formed by the reaction of sulfur dioxide (SO2) and an olefin, are known to be highly susceptible to degradation by radiation and have thus been identified as candidate materials for chain scission-based extreme ultraviolet lithography (EUVL) resists. To investigate this further, two poly(olefin sulfone)s, namely poly(1-pentene sulfone) (PPS) and poly(2-methyl-1-pentene sulfone) (PMPS), were synthesised and characterised, and the two materials were evaluated for possible chain scission EUVL resist applications. It was found that both materials possess high sensitivities to EUV photons; however, the rates of outgassing were extremely high. The only observed degradation products were SO2 and the respective olefin, suggesting that depolymerisation takes place under irradiation in a vacuum environment. In addition to depolymerisation, a concurrent conversion of SO2 moieties to a sulfide phase was observed using XPS.
Abstract:
Some initial EUVL patterning results for polycarbonate-based non-chemically amplified resists are presented. Without full optimization of the developer, a resolution of 60 nm line/space features could be obtained. With slight overexposure (1.4 × E0), 43.5 nm lines at a half pitch of 50 nm could be printed. At 2 × E0, 28.6 nm lines at a half pitch of 50 nm could be obtained, with an LER just above that expected from mask roughness. Upon being irradiated with EUV photons, these polymers undergo chain scission with the loss of carbon dioxide and carbon monoxide. The remaining photoproducts appear to be non-volatile under standard EUV irradiation conditions, but do exhibit increased solubility in developer compared to the unirradiated polymer. The sensitivity of the polymers to EUV light is related to their oxygen content, and ways to increase the sensitivity of the polymers to 10 mJ cm-2 are discussed.
Abstract:
The hexagonal resonator characteristics of an individual ZnO nanonail’s head were investigated via spatially resolved cathodoluminescence (CL) at room temperature. The positions of most of the distinct CL peaks in the visible range were well matched to those of whispering gallery modes (WGMs) of a hexagonal dielectric cavity when birefringence and the dispersion of the refractive indices were taken into account. The broad and weak peaks for TE polarization in the long-wavelength range were consistent with refractive-index values below the threshold for total internal reflection. CL peaks that were not matched to WGMs were identified as either triangular quasi-WGM or Fabry–Pérot resonance modes.
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’, or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods proved inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
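As a rough illustration of the second analysis method, the sketch below compares a joint distribution of associate counts against the product of its marginals using a chi-square goodness-of-fit statistic. All counts are hypothetical, invented for illustration; the study's actual free-association data and bi-ambiguous combinations are not reproduced here.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical 2x2 table of associate counts for the two senses of each
# word in a bi-ambiguous combination (all numbers are illustrative).
observed = np.array([[40.0, 10.0],
                     [12.0, 38.0]])
n = observed.sum()

# Expected counts under the separability (independence) assumption:
# outer product of the two marginal distributions.
row_marg = observed.sum(axis=1) / n
col_marg = observed.sum(axis=0) / n
expected = n * np.outer(row_marg, col_marg)

# Chi-square goodness-of-fit statistic against the separable model.
stat = ((observed - expected) ** 2 / expected).sum()
dof = 1  # (rows - 1) * (cols - 1) for a 2x2 table
p_value = chi2.sf(stat, dof)
```

A small p-value here would reject the separability assumption for that combination; with the illustrative counts above, the joint distribution departs strongly from the product of its marginals.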
Abstract:
A wireless sensor network system must be able to tolerate harsh environmental conditions and reduce communication failures. In a typical outdoor situation, the presence of wind can introduce movement in the foliage. This motion of vegetation structures causes large and rapid signal fading in the communication link and must be accounted for when deploying a wireless sensor network in such conditions. This thesis examines the fading characteristics experienced by wireless sensor nodes due to the effect of varying wind speed in a foliage-obstructed transmission path. It presents extensive measurement campaigns at two locations using a typical wireless sensor network configuration. The significance of this research lies in the varied approaches of its experiments, involving a variety of vegetation types, scenarios, and polarisations (vertical and horizontal). The non-line of sight (NLoS) scenarios investigate the wind effect for different vegetation densities, including the Acacia tree, Dogbane tree, and tall grass, whereas the line of sight (LoS) scenario investigates the effect of wind when swaying grass affects the ground-reflected component of the signal. The vegetation types and scenarios are envisaged to simulate real-life working conditions of wireless sensor networks in outdoor foliated environments. The results from the measurements are presented as statistical models involving first- and second-order statistics. We found that in most cases the fading amplitude could be approximated by both the Lognormal and the Nakagami distribution, whose m parameter was found to depend on received power fluctuations. The Lognormal distribution is known to result from slow fading characteristics due to shadowing. This study concludes that fading caused by wind-induced variations in received power in wireless sensor network systems is insignificant.
There is no notable difference in the Nakagami m values across the low, calm, and windy wind-speed categories. The second-order analysis also shows that the durations of deep fades are very short: 0.1 s for 10 dB attenuation below the RMS level for vertical polarization, and 0.01 s for horizontal polarization. Another key finding is that the received signal strength for horizontal polarisation is more than 3 dB better than for vertical polarisation under LoS and near-LoS (thin vegetation) conditions, and up to 10 dB better under denser vegetation conditions.
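As a minimal sketch of the first-order analysis described above, the following fits the Nakagami m parameter to fading-amplitude samples. The data here are synthetic draws standing in for measured envelopes (the thesis fits field measurements), and the true m of 2.0 is an arbitrary illustrative choice.

```python
import numpy as np
from scipy import stats

# Synthetic fading-amplitude samples standing in for a measured envelope
# (illustrative only; the thesis uses field measurements).
true_m = 2.0
samples = stats.nakagami.rvs(true_m, scale=1.0, size=5000,
                             random_state=np.random.default_rng(42))

# Fit the Nakagami m (shape) parameter, fixing loc = 0 as appropriate
# for an amplitude envelope.
m_hat, loc, scale = stats.nakagami.fit(samples, floc=0)

# A Lognormal fit to the same data, for comparison as in the study.
sigma_hat, _, scale_ln = stats.lognorm.fit(samples, floc=0)
```

Comparing the two fitted distributions against the empirical histogram (e.g. via a goodness-of-fit test) mirrors the comparison of Lognormal and Nakagami models reported in the thesis.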
Abstract:
The nitrile imine-mediated tetrazole-ene cycloaddition reaction (NITEC) is introduced as a powerful and versatile conjugation tool to covalently ligate macromolecules onto variable (bio)surfaces. The NITEC approach is initiated by UV irradiation and proceeds rapidly at ambient temperature yielding a highly fluorescent linkage. Initially, the formation of block copolymers by the NITEC methodology is studied to evidence its efficacy as a macromolecular conjugation tool. The grafting of polymers onto inorganic (silicon) and bioorganic (cellulose) surfaces is subsequently carried out employing the optimized reaction conditions obtained from the macromolecular ligation experiments and evidenced by surface characterization techniques, including X-ray photoelectron spectroscopy and FT-IR microscopy. In addition, the patterned immobilization of variable polymer chains onto profluorescent cellulose is achieved through a simple masking process during the irradiation. Photoinduced nitrile imine-alkene 1,3-dipolar cycloaddition (NITEC) is employed to covalently bind well-defined polymers onto silicon oxide or cellulose. A diaryl tetrazole-functionalized molecule is grafted via silanization or amidification, respectively. Under UV light, a reactive nitrile imine rapidly forms and reacts with maleimide-functionalized polymers yielding a fluorescent linkage. Via a masking method, polymeric fluorescent patterns are achieved.
Abstract:
The intensity pulsations of a cw 1030 nm Yb:phosphate monolithic waveguide laser with distributed feedback are described. We show that the pulsations can result from the coupling of the two orthogonal polarization modes through the two-photon process of cooperative luminescence. The predictions of the presented theoretical model agree well with the observed behaviour.
Abstract:
The purpose of this study was to investigate the effect of very small air gaps (less than 1 mm) on the dosimetry of the small photon fields used for stereotactic treatments. Measurements were performed with optically stimulated luminescent dosimeters (OSLDs) for 6 MV photons on a Varian 21iX linear accelerator with a Brainlab μMLC attachment, for square field sizes down to 6 mm × 6 mm. Monte Carlo simulations were performed using the EGSnrc C++ user code cavity. It was found that the Monte Carlo model used in this study accurately simulated the OSLD measurements on the linear accelerator. For the 6 mm field size, a 0.5 mm air gap upstream of the active area of the OSLD caused a 5.3% dose reduction relative to a Monte Carlo simulation with no air gap. A hypothetical 0.2 mm air gap caused a dose reduction > 2%, emphasizing that even the tiniest air gaps can cause a large reduction in measured dose. The negligible effect for an 18 mm field size illustrated that the electronic disequilibrium caused by such small air gaps only affects the dosimetry of very small fields. When performing small field dosimetry, care must be taken to avoid any air gaps, as they can often be present when inserting detectors into solid phantoms. It is recommended that very small field dosimetry be performed in liquid water. When using small photon fields, sub-millimetre air gaps can also affect patient dosimetry if they cannot be spatially resolved on a CT scan. However, the effect on the patient is debatable: the dose reduction caused by a 1 mm air gap starts at 19% in the first 0.1 mm behind the air gap, decreases to < 5% after just 2 mm, and electronic equilibrium is fully re-established after just 5 mm.
Abstract:
Utilization of multiport antennas is an appropriate way to mitigate multi-path fading in wireless communication systems. However, to obtain low correlation between the signals from different antenna ports and to prevent gain reduction by cross-talk, large antenna element spacing is required. Polarization diversity allows signal separation even with small antenna spacing. Although effective, polarization diversity alone does not suffice once the number of antennas exceeds the number of orthogonal polarizations. This paper presents an approach which combines a novel array concept with the use of dual polarization. The theory is verified by a compact dual-polarized patch antenna array consisting of four elements and a decoupling network.
Abstract:
A year ago, I became aware of the historical existence of the group CERFI— Le Centre d’études, de recherches et de formation institutionnelles, or The Study Center for Institutional Research and Formation. CERFI emerged in 1967 under the hand of Lacanian psychiatrist and Trotskyite activist Félix Guattari, whose journal Recherches chronicled the group’s subversive experiences, experiments, and government-sponsored urban projects. It was a singularly bizarre meeting of the French bureaucracy with militant activist groups, the French intelligentsia, and architectural and planning practitioners at the close of the ’60s. Nevertheless, CERFI’s analysis of the problems of society was undertaken precisely from the perspective of the state, and the Institute acknowledged a “deep complicity between the intellectual and statesman ... because the first critics of the State, are officials themselves!”1 CERFI developed out of FGERI (The Federation of Groups for Institutional Study and Research), started by Guattari two years earlier. While FGERI was created for the analysis of mental institutions stemming from Guattari’s work at La Borde, an experimental psychiatric clinic, CERFI marks the group’s shift toward urbanism—to the interrogation of the city itself. Not only a platform for radical debate on architecture and the city, CERFI was a direct agent in the development of urban planning schemata for new towns in France. 2 CERFI’s founding members were Guattari, the economist and urban theorist François Fourquet, feminist philosopher Liane Mozère, and urban planner and editor of Multitudes Anne Querrien—Guattari’s close friend and collaborator. The architects Antoine Grumback, Alain Fabre, Macary, and Janine Joutel were also members, as well as urbanists Bruno Fortier, Rainier Hoddé, and Christian de Portzamparc. 3 CERFI was the quintessential social project of post-’68 French urbanism.
Located on the Far Left and openly opposed to the Communist Party, this Trotskyist cooperative was able to achieve what other institutions, according to Fourquet, with their “customary devices—the politburo, central committee, and the basic cells—had failed to do.”4 The decentralized institute recognized that any formal integration of the group was to “sign its own death warrant; so it embraced a skein of directors, entangled, forming knots, liquidating all at once, and spinning in an unknown direction, stopping short and returning back to another node.” Allergic to the very idea of “party,” CERFI was a creative project of free, hybrid-aesthetic blocs talking and acting together, whose goal was none other than the “transformation of the libidinal economy of the militant revolutionary.” The group believed that by recognizing and affirming a “group unconscious,” as well as their individual unconscious desires, they would be able to avoid the political stalemates and splinter groups of the traditional Left. CERFI thus situated itself “on the side of psychosis”—its confessed goal was to serve rather than repress the utter madness of the urban malaise, because it was only from this mad perspective on the ground that a properly social discourse on the city could be forged.
Abstract:
Purpose: ipRGCs mediate non-image forming functions including photoentrainment and the pupil light reflex (PLR). Temporal summation increases visual sensitivity and decreases temporal resolution for image forming vision, but the summation properties of non-image forming vision are unknown. We investigated the temporal summation of inner (ipRGC) and outer (rod/cone) retinal inputs to the PLR. Method: The consensual PLR of the left eye was measured in six participants with normal vision using a Maxwellian view infrared pupillometer. Temporal summation was investigated using a double-pulse protocol (100 ms stimulus pairs; 0–1024 ms inter-stimulus interval, ISI) presented to the dilated fellow right eye (Tropicamide 1%). Stimulus lights (blue λmax = 460 nm; red λmax = 638 nm) biased activity to inner or outer retinal inputs to non-image forming vision. Temporal summation was measured suprathreshold (15.2 log photons.cm−2.s−1 at the cornea) and subthreshold (11.4 log photons.cm−2.s−1 at the cornea). Results: RM-ANOVAs showed that the suprathreshold and subthreshold 6 s post-illumination pupil response (PIPR: expressed as percentage baseline diameter) did not significantly vary for red or blue stimuli (p > .05). The PIPR for a subthreshold red 16 ms double-pulse control condition did not significantly differ with ISI (p > .05). The maximum constriction amplitude for red and blue 100 ms double-pulse stimuli did not significantly vary with ISI (p > .05). Conclusion: The non-significant changes in suprathreshold PIPR and subthreshold maximum pupil constriction indicate that inner retinal ipRGC inputs and outer retinal photoreceptor inputs to the PLR do not show temporal summation. The results suggest a fundamental difference between the temporal summation characteristics of image forming and non-image forming vision.
Abstract:
Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. Previous research assumed a quantum-like model in which the semantic network was modelled as entangled qubits; however, the level of activation was clearly being over-estimated. This paper explores three variations of this model, each of which is distinguished by a scaling factor designed to compensate for the over-estimation.
Abstract:
The aim of this work is to develop software that is capable of back projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray tracing process, the energy differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated to an energy differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point spread kernels, thus generating energy differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprised of 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model we will use EPID images recorded without any attenuating material in the beam for a number of MLC defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.
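The TERMA-and-convolution step described above can be sketched in a highly simplified form: a monoenergetic parallel beam of unity surface fluence is attenuated exponentially with depth, TERMA is computed from the attenuation coefficient, and a Gaussian kernel stands in for the Monte Carlo point spread kernels. The attenuation coefficient, grid size, and kernel width are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Highly simplified stand-in for the back-projection dose model:
# monoenergetic parallel beam, water phantom, Gaussian kernel instead
# of Monte Carlo point spread kernels. All numbers are illustrative.
mu = 0.0707            # linear attenuation coefficient of water, cm^-1
energy = 1.0           # photon energy, MeV
voxel = 0.5            # voxel size, cm
nz, ny, nx = 60, 60, 60

# Unity surface fluence (as in the test EPID image), attenuated with depth.
depth = np.arange(nz) * voxel
fluence = np.exp(-mu * depth)[:, None, None] * np.ones((nz, ny, nx))

# TERMA = (mu / rho) * energy fluence; rho = 1 g/cm^3 for water.
terma = mu * energy * fluence

# Spread energy with the kernel to obtain a dose-like distribution.
dose = gaussian_filter(terma, sigma=2.0)
```

A symmetric Gaussian cannot reproduce the forward-peaked build-up of real kernels; the paper's polyenergetic spectrum would repeat this per energy bin and sum the resulting energy differential distributions.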
Abstract:
Dose kernels may be used to calculate dose distributions in radiotherapy (as described by Ahnesjö et al., 1999). Their calculation requires the use of Monte Carlo methods, usually by forcing interactions to occur at a point. The Geant4 Monte Carlo toolkit provides a capability to force interactions to occur in a particular volume. We have modified this capability and created a Geant4 application to calculate dose kernels in cartesian, cylindrical, and spherical scoring systems. The simulation considers monoenergetic photons incident at the origin of a 3 m × 3 m × 3 m water volume. Photons interact via Compton scattering, the photoelectric effect, pair production, and Rayleigh scattering. By default, Geant4 models photon interactions by sampling a physical interaction length (PIL) for each process. The process returning the smallest PIL is then considered to occur. In order to force the interaction to occur within a given length, L_FIL, we scale each PIL according to the formula: PIL_forced = L_FIL × (1 - exp(-PIL/PIL0)), where PIL0 is a constant. This ensures that the process occurs within L_FIL, whilst correctly modelling the relative probability of each process. Dose kernels were produced for incident photon energies of 0.1, 1.0, and 10.0 MeV. In order to benchmark the code, dose kernels were also calculated using the EGSnrc Edknrc user code. Identical scoring systems were used; namely, the collapsed cone approach of the Edknrc code. Relative dose difference images were then produced. Preliminary results demonstrate the ability of the Geant4 application to reproduce the shape of the dose kernels; median relative dose differences of 12.6, 5.75, and 12.6 % were found for incident photon energies of 0.1, 1.0, and 10.0 MeV respectively.