951 results for Kim, Jaegwon: Physicalism, or something near enough
Abstract:
Crossing the Franco-Swiss border, the Large Hadron Collider (LHC), designed to collide 7 TeV proton beams, is the world's largest and most powerful particle accelerator; its operation was originally intended to commence in 2008. Unfortunately, due to an interconnect discontinuity in one of the main dipole circuit's 13 kA superconducting busbars, a catastrophic quench event occurred during initial magnet training, causing significant physical damage to the system. Investigation into the cause found that such discontinuities were present not only in the circuit in question but throughout the entire LHC. This prevented further magnet training and ultimately limited the maximum sustainable beam energy to approximately half of the design nominal, 3.5-4 TeV, for the first three years of operation (Run 1, 2009-2012), and led to a major consolidation campaign being scheduled for the first long shutdown (LS 1, 2012-2014). Throughout Run 1, a series of studies attempted to predict the number of post-installation training quenches still required to qualify each circuit to nominal-energy current levels. With predictions in excess of 80 quenches (each with a recovery time of 8-12+ hours) just to achieve 6.5 TeV, and close to 1000 quenches for 7 TeV, it was decided that for Run 2 all systems would be qualified for at least 6.5 TeV operation. However, even with all interconnect discontinuities scheduled for repair during LS 1, numerous other concerns regarding circuit stability arose: in particular, observations of erratic behaviour in magnet bypass diodes, of degradation in other potentially weak busbar sections, and of seemingly random millisecond spikes in beam losses, known as unidentified falling object (UFO) events, which, if they persist at 6.5 TeV, may eventually deposit sufficient energy to quench adjacent magnets.
In light of the above, the thesis hypothesis states that, even with the observed issues, the LHC main dipole circuits can safely support and sustain near-nominal proton beam energies of at least 6.5 TeV. Research into minimising the risk of magnet training led to the development and implementation of a new qualification method, capable of providing conclusive evidence that all aspects of all circuits, other than the magnets and their internal joints, can safely withstand a quench event at near-nominal current levels, allowing magnet training to be carried out both systematically and without risk. This method has become known as the Copper Stabiliser Continuity Measurement (CSCM). The results were a success, with all circuits eventually subjected to a full current decay from 6.5 TeV-equivalent current levels with no measurable damage. Research into UFO events led to the development of a numerical model capable of simulating typical UFO events, reproducing the entire set of events measured in Run 1 and extrapolating to 6.5 TeV to predict the likelihood of UFO-induced magnet quenches. The results provided interesting insights into the phenomena involved and confirmed the possibility of UFO-induced magnet quenches. The model was also able to predict whether such events, if left unaccounted for, were likely to be commonplace, resulting in significant long-term issues for 6.5+ TeV operation. Addressing the thesis hypothesis, the following written works detail the development and results of all CSCM qualification tests and subsequent magnet training, as well as the development and simulation results of both 4 TeV and 6.5 TeV UFO event modelling. The thesis concludes, post-LS 1, with the LHC successfully sustaining 6.5 TeV proton beams, but with UFO events, as predicted, causing otherwise unprovoked magnet quenches and standing at the forefront of system availability issues.
Abstract:
In the standard Vehicle Routing Problem (VRP), we route a fleet of vehicles to deliver the demands of all customers such that the total distance traveled by the fleet is minimized. In this dissertation, we study variants of the VRP that minimize the completion time, i.e., we minimize the distance of the longest route. We call it the min-max objective function. In applications such as disaster relief efforts and military operations, the objective is often to finish the delivery or the task as soon as possible, not to plan routes with the minimum total distance. Even in commercial package delivery nowadays, companies are investing in new technologies to speed up delivery instead of focusing merely on the min-sum objective. In this dissertation, we compare the min-max and the standard (min-sum) objective functions in a worst-case analysis to show that the optimal solution with respect to one objective function can be very poor with respect to the other. The results motivate the design of algorithms specifically for the min-max objective. We study variants of min-max VRPs including one problem from the literature (the min-max Multi-Depot VRP) and two new problems (the min-max Split Delivery Multi-Depot VRP with Minimum Service Requirement and the min-max Close-Enough VRP). We develop heuristics to solve these three problems. We compare the results produced by our heuristics to the best-known solutions in the literature and find that our algorithms are effective. In the case where benchmark instances are not available, we generate instances whose near-optimal solutions can be estimated based on geometry. We formulate the Vehicle Routing Problem with Drones and carry out a theoretical analysis to show the maximum benefit from using drones in addition to trucks to reduce delivery time. The speed-up ratio depends on the number of drones loaded onto one truck and the speed of the drone relative to the speed of the truck.
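The contrast between the min-sum and min-max objectives can be made concrete with a toy instance; the coordinates, routes, and distance function below are invented for illustration and are not from the dissertation's benchmarks.

```python
import math

def route_length(route, dist):
    """Length of one route, given a distance lookup keyed by city pairs."""
    return sum(dist[a, b] for a, b in zip(route, route[1:]))

def min_sum_objective(routes, dist):
    """Standard VRP objective: total distance traveled by the whole fleet."""
    return sum(route_length(r, dist) for r in routes)

def min_max_objective(routes, dist):
    """Min-max objective: distance of the longest route (completion time)."""
    return max(route_length(r, dist) for r in routes)

# Hypothetical instance: depot 0 at the origin, two customers far away
# but close to each other.
coords = {0: (0.0, 0.0), 1: (10.0, 0.0), 2: (10.0, 1.0)}
dist = {(i, j): math.dist(coords[i], coords[j]) for i in coords for j in coords}

single = [[0, 1, 2, 0]]         # one vehicle serves both customers
split = [[0, 1, 0], [0, 2, 0]]  # one customer per vehicle

# The min-sum objective prefers the single long route, while the min-max
# objective (completion time) prefers the split solution.
```

The worst-case analysis in the dissertation shows this gap can be made arbitrarily large; the toy instance only illustrates that the two optima can disagree.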
Abstract:
We present measurements of the transmission spectra of 87Rb atoms at 780 nm in the vicinity of a nanofiber. A uniform distribution of fixed atoms around a nanofiber should produce a spectrum broadened towards the red, due to shifts from the van der Waals potential. If the atoms are free, this potential also produces an attractive force that accelerates them until they collide with the fiber, which depletes the steady-state density of near-surface atoms; it is for this reason that measurements of the van der Waals interaction are sparse. We confirm this by measuring the spectrum of cold atoms from a magneto-optical trap around the fiber, revealing a symmetric line shape with nearly the natural linewidth of the transition. When we use an auxiliary 750 nm laser, we are able to controllably desorb a steady flux of atoms from the fiber that reside near the surface (less than 50 nm) long enough to feel the van der Waals interaction and produce an asymmetric spectrum. We quantify the spectral asymmetry as a function of 750 nm laser power and find a maximum. Our model, which takes into account the change in the density distribution, qualitatively explains the observations. In the future this can be used as a tool to study atom-surface interactions more comprehensively.
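The red-broadened, asymmetric line shape described above can be sketched with a toy model: an atom at distance r from the surface absorbs at a detuning red-shifted by an r^-3 van der Waals term, and the observed spectrum averages over the distance distribution. The C3 strength, distance range, and uniform-in-r distribution below are illustrative assumptions, not the paper's fitted model.

```python
import math

def lorentzian(delta, gamma):
    """Natural (Lorentzian) line profile at detuning delta, linewidth gamma."""
    return (gamma / (2 * math.pi)) / (delta ** 2 + (gamma / 2) ** 2)

def line_shape(detunings, c3, gamma, r_min=50.0, r_max=500.0, n=2000):
    """Average spectrum of atoms spread uniformly in distance r (nm) from
    the surface, each red-shifted by -c3 / r**3 (c3 in linewidth * nm**3)."""
    rs = [r_min + (r_max - r_min) * k / n for k in range(n + 1)]
    return [sum(lorentzian(d + c3 / r ** 3, gamma) for r in rs) / len(rs)
            for d in detunings]

# With no surface shift the line is symmetric; a hypothetical C3 strength
# enhances the red wing relative to the blue wing.
red, blue = line_shape([-2.0, 2.0], 1.0e6, 1.0)
```

Since every atom is shifted to the red, the sampled spectrum is strictly larger on the red side of line center, which is the asymmetry the experiment quantifies.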
Abstract:
Ph.D. in the Faculty of Business Administration
Abstract:
In the world of work, success depends on the demonstrated capacity to know and perform one's role, and the teaching career is no exception to this rule. Just as actors and actresses are judged by the quality they bring to the roles they play, value judgements about teachers rest on their demonstrated aptitude for mastering all the subtleties inherent in the teacher's role. Their way of acting in delicate and exceptional situations of indiscipline and violence, and their feelings, constitute the object of analysis in this work. Teachers often act on the basis of concepts passed down by their parents and trainers, which may have been valid for them but certainly are not today. These concepts may have worked at the start of their careers but, as a rule, lost their usefulness in later stages. Through the analysis of teachers' life stories, several hypotheses are raised about how these stances are constructed and about the ingredients that make it possible to embrace the necessary change. It is urgent to change the way teachers think about approaching situations of school indiscipline and violence, because this is essential to changing behaviours that may be harmful to them. This work had two purposes: to provide information grounded in solid theoretical bases that can support well-adjusted classroom interventions, and to prompt teachers to reflect on their roles and practices. We do not claim, however, that this information is sufficient to solve problems of indiscipline, but we nurture the hope that the avenues for reflection and action, as well as the concepts mobilised, may contribute to educational practice that is at once more effective and more satisfying.
Thus, we believe this work will encourage teachers to face situations of indiscipline and violence more calmly, with greater emotional distance, without rigid recourse to self-blame or to accusing others. ABSTRACT: In the working world, success depends on the way we know and play our roles, and the teaching role is no different. An actor (or actress) is evaluated by the quality of his (or her) role performance; in the same way, value judgements about teachers are linked to their capacity to master all aspects of the teaching role. Their way of intervening in problematic and exceptional situations, dealing with violence and indiscipline, and their feelings are the main object of this research project. Teachers often act by recalling what their parents and former teachers taught them. These old concepts and ideas may have worked well at the beginning of their careers but are no longer valid today. By analysing the life stories of teachers, we can find several explanations for their stances when facing these problems. It is necessary to urge them to change their minds and to transform their reactions to problematic situations of indiscipline and violence, or they will face more problems in the near future. This research project has two aims: to provide theoretical information that helps teachers act appropriately in the classroom, and to show that time for reflection on their roles and practices is necessary. We are not so naïve as to think that this project is enough to solve all the problems of indiscipline and violence in schools, but we firmly believe it will help teachers face those problems more calmly, less emotionally, and with better results.
Abstract:
The concept of cooperation is often used in ethics and politics to illustrate and understand the alignment of associative behaviours among human beings. In connection with this concept, our research addresses, first, the question of whether Kim Sterelny (2003) succeeds in producing a theoretical model capable of explaining the origins and mechanisms of human cooperation. It addresses, second, the question of whether he manages to use this model to refute the massive modularity thesis. This dissertation thus deals in turn with the problem of cooperation, the theory of group selection, the ecological trigger of hominid cooperation, the notions of coalition, enforcement and commitment, and finally the massive modularity thesis. Through the examination of these topics, we aim to show that Sterelny only manages to provide a plausible sketch of the origins and development of human cooperation, and that his critique of the massive modularity thesis fails to refute it.
Abstract:
The proverb “seeing is believing” suits fluorescent imaging probes well. Because we can selectively and sensitively visualize small biomolecules, organelles such as lysosomes, neutral molecules, metal ions, and anions through cellular imaging, fluorescent probes can help shed light on physiological and pathophysiological pathways. Since these biomolecules are produced at low concentrations in biochemical pathways, general analytical techniques either fail to detect them or are not sensitive enough to differentiate their relative concentrations. During my Ph.D. study, I exploited synthetic organic techniques to design and synthesize fluorescent probes with desirable properties such as high water solubility, high sensitivity, and varying fluorescence quantum yields. I synthesized a highly water-soluble BODIPY-based turn-on fluorescent probe for endogenous nitric oxide. I also synthesized a series of cell-membrane-permeable near-infrared (NIR) pH-activatable fluorescent probes for lysosomal pH sensing. Fluorescent dyes are molecular tools for designing fluorescent bioimaging probes. This prompted me to design and synthesize a hybrid fluorescent dye with a functionalizable chlorine atom and to test the chlorine's reactivity for fluorescent probe design. Carbohydrate-protein interactions are key to many biological processes, such as viral and bacterial infections, cell recognition and adhesion, and immune response. Among the analytical techniques aimed at studying these interactions, electrochemical biosensing is particularly efficient due to its low cost, ease of operation, and potential for miniaturization. During my Ph.D., I synthesized a mannose-bearing aniline molecule that was successfully tested as an electrochemical biosensor. A ferrocene-mannose conjugate with an anchoring group was also synthesized, which can be used as a potential electrochemical biosensor.
Abstract:
To describe the epidemiology of domestic swimming pool drowning and near-drowning in Brisbane and to examine the efficacy of a broad range of preventive options, including pool fences. A prospective, hospital-based injury surveillance system was used to describe the epidemiology of drowning and near-drowning, and a community survey to describe pool fencing. The surveillance questionnaire was completed at presentation in the Emergency Department by the parent, nurse and doctor. Personal interviews in households randomly selected by means of a stratified sampling scheme provided the pool fencing description. All 139 children suffering an immersion injury resulting in presentation at a hospital in the catchment area of The Mater Children's Hospital were included. There were 204 households with a swimming pool among the 1024 households interviewed in the community survey. The 100 domestic pool drownings and near-drownings were equivalent to 15.5 incidents per year per 100,000 children aged 0-13 years, and 64.9 per year per 100,000 for the critical 1-3 years age group. Of 72 children who gained unintended access to a domestic pool, 88.9% were less than 3 years of age and 52.8% were less than 2 years. All 10 of the children who drowned and the five who were severely brain damaged (age range, 12-32 months) were in this group. The risk of a drowning or near-drowning involving unintended access to an unfenced pool is 3.76 times higher than the risk associated with a fenced pool (95% confidence limits for relative risk: 2.14, 6.62). Pool fences are an effective method of preventing child drownings and near-drownings. This effectiveness can be further improved if compliance with gate closure can be enhanced, and this should be emphasised in health promotion accompanying the introduction of universal pool fencing. Article in The Medical Journal of Australia 154(10):661-5 · June 1991
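The relative-risk estimate quoted above can be illustrated with the standard Katz log-scale confidence interval; the counts in this sketch are hypothetical and are not the study's data.

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk of an event in an exposed group (a events out of n1)
    versus an unexposed group (b out of n2), with a Katz-style 95%
    confidence interval computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = rr * math.exp(-z * se)
    hi = rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical counts, NOT the study's data: 15 immersions per 40
# unfenced-pool exposures versus 10 per 100 fenced-pool exposures.
rr, lo, hi = relative_risk_ci(15, 40, 10, 100)
```

An interval whose lower limit stays above 1, as in the study's (2.14, 6.62), is what supports the conclusion that unfenced pools carry a genuinely higher risk.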
Abstract:
The popularity of cloud computing has led to a dramatic increase in the number of data centers in the world. The ever-increasing computational demands along with the slowdown in technology scaling have ushered in an era of power-limited servers. Techniques such as near-threshold computing (NTC) can be used to improve energy efficiency in the post-Dennard scaling era. This paper describes an architecture based on the FD-SOI process technology for near-threshold operation in servers. Our work explores the trade-offs in energy and performance when running a wide range of applications found in private and public clouds, ranging from traditional scale-out applications, such as web search or media streaming, to virtualized banking applications. Our study demonstrates the benefits of near-threshold operation and proposes several directions to synergistically increase the energy proportionality of a near-threshold server.
Abstract:
Even more so than in other arts, film has tried to draw an artificial but clear line between eroticism and pornography, nonetheless perpetuating moral judgments about movies marketed as “erotic”. The explicit and repeated portrayal of sex in such films would place them dangerously near the vortex of the pornographic, and thus, since they are not concerned with transcendental issues, they would require little or no critical attention. I will, however, try to argue, using Last Tango in Paris and Une liaison pornographique, that many of these “erotic” films conclude that a relationship based solely on sex (i.e. “pornographic”), one which ignores the complexities of individual identity and the interpersonal, is doomed to fail. I would also like to show how these films ultimately conceive of sex as something that goes beyond the merely physical and enters the territory of such transcendental issues as despair, loneliness, death, or love.
Abstract:
This research develops an econometric framework to analyze time series processes with bounds. The framework is general enough that it can incorporate several different kinds of bounding information that constrain continuous-time stochastic processes between discretely-sampled observations. It applies to situations in which the process is known to remain within an interval between observations, by way of either a known constraint or through the observation of extreme realizations of the process. The main statistical technique employs the theory of maximum likelihood estimation. This approach leads to the development of the asymptotic distribution theory for the estimation of the parameters in bounded diffusion models. The results of this analysis present several implications for empirical research. The advantages are realized in the form of efficiency gains, bias reduction and in the flexibility of model specification. A bias arises in the presence of bounding information that is ignored, while it is mitigated within this framework. An efficiency gain arises, in the sense that the statistical methods make use of conditioning information, as revealed by the bounds. Further, the specification of an econometric model can be uncoupled from the restriction to the bounds, leaving the researcher free to model the process near the bound in a way that avoids bias from misspecification. One byproduct of the improvements in model specification is that the more precise model estimation exposes other sources of misspecification. Some processes reveal themselves to be unlikely candidates for a given diffusion model, once the observations are analyzed in combination with the bounding information. A closer inspection of the theoretical foundation behind diffusion models leads to a more general specification of the model. This approach is used to produce a set of algorithms to make the model computationally feasible and more widely applicable. 
Finally, the modeling framework is applied to a series of interest rates, which, for several years, have been constrained by the lower bound of zero. The estimates from a series of diffusion models suggest a substantial difference in estimation results between models that ignore bounds and the framework that takes bounding information into consideration.
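As a minimal sketch of how bounding information enters a likelihood, consider driftless Brownian motion observed at discrete times and known to stay above zero between observations: by the method of images, the constrained transition (sub-)density is the free Gaussian density minus its reflection about the bound. This is a textbook special case chosen for illustration, not the dissertation's estimated model.

```python
import math

def free_density(x, y, sigma, dt):
    """Unconstrained Gaussian transition density of Brownian motion."""
    s = sigma * math.sqrt(dt)
    return math.exp(-(y - x) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def bounded_density(x, y, sigma, dt):
    """Sub-density of moving x -> y while staying above 0 between
    observations (method of images: free density minus its reflection)."""
    return free_density(x, y, sigma, dt) - free_density(-x, y, sigma, dt)

def log_likelihood(obs, sigma, dt, bounded=True):
    """Log-likelihood of discretely sampled positive observations."""
    dens = bounded_density if bounded else free_density
    return sum(math.log(dens(x, y, sigma, dt)) for x, y in zip(obs, obs[1:]))
```

The free density overstates the probability of transitions near the bound because it also counts paths that cross zero; maximizing the bounded likelihood instead is one way the framework's bias reduction arises.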
Abstract:
Near-infrared polarimetry is a powerful tool for studying the sources at the center of the Milky Way. The aim of this thesis is to analyze the polarized emission from the central few light years of the Galactic Center region, in particular the non-thermal polarized emission of Sagittarius~A* (Sgr~A*), the electromagnetic manifestation of the super-massive black hole, and the polarized emission of an infrared-excess source referred to in the literature as DSO/G2, which is in orbit about Sgr~A*. In this thesis I focus on Galactic Center observations at $\lambda=2.2~\mu m$ ($K_\mathrm{s}$-band) in polarimetry mode during several epochs from 2004 to 2012. The near-infrared polarimetric observations were carried out using the adaptive optics instrument NAOS/CONICA and a Wollaston prism at the Very Large Telescope of ESO (European Southern Observatory). Linear polarization at 2.2 $\mu m$, its flux statistics and time variation, can be used to constrain the physical conditions of the accretion process onto the central super-massive black hole. I present a statistical analysis of polarized $K_\mathrm{s}$-band emission from Sgr~A* and investigate the most comprehensive sample of near-infrared polarimetric light curves of this source to date. I find several polarized flux excursions over the years and obtain an exponent of about 4 for the power law fitted to the distribution of polarized flux densities above 5~mJy; this distribution is therefore closely linked to the single-state power-law distribution of total $K_\mathrm{s}$-band flux densities we reported earlier. I find polarization degrees of the order of 20\%$\pm$10\% and a preferred polarization angle of $13^\circ\pm15^\circ$.
Based on simulations of polarimetric measurements, given the observed flux density and its uncertainty in orthogonal polarimetry channels, I find that the uncertainties of the polarization parameters below a total flux density of $\sim 2\,{\mathrm{mJy}}$ are probably dominated by observational uncertainties. At higher flux densities there are intrinsic variations of the polarization degree and angle within rather well-constrained ranges. Since the emission is most likely optically thin synchrotron radiation, the preferred polarization angle very likely reflects the intrinsic orientation of the Sgr~A* system, i.e. an accretion disk or a jet/wind scenario coupled to the super-massive black hole. Our polarization statistics show that Sgr~A* must be a stable system, both in its geometry and in its accretion process. I also investigate the infrared-excess source called G2, or Dusty S-cluster Object (DSO), moving on a highly eccentric orbit around the Galaxy's central black hole, Sgr~A*. I use near-infrared polarimetric imaging data for the first time to determine the nature and properties of the DSO, and obtain an improved $K_\mathrm{s}$-band identification of this source in median polarimetry images of different observing years. The source starts to stand out from the stellar confusion in the 2008 data and does not show flux density variability in our data set. Furthermore, I measure the polarization degree and angle of this source and conclude, based on simulations of the polarization parameters, that it is an intrinsically polarized source with a polarization angle that varies as it approaches the position of Sgr~A*. I use the DSO polarimetry measurements to assess its possible properties.
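Polarization degrees and angles of the kind quoted above follow from a standard Stokes-parameter reduction of fluxes measured in orthogonal polarimetry channels at four position angles; a minimal sketch, with invented channel fluxes, is:

```python
import math

def linear_polarization(f0, f45, f90, f135):
    """Linear polarization from fluxes measured at 0/45/90/135 degrees:
    normalized Stokes q and u, then degree (in %) and angle (in degrees)."""
    q = (f0 - f90) / (f0 + f90)
    u = (f45 - f135) / (f45 + f135)
    degree = 100.0 * math.hypot(q, u)
    angle = 0.5 * math.degrees(math.atan2(u, q))
    return degree, angle

# Hypothetical channel fluxes (mJy), for illustration only:
deg, ang = linear_polarization(6.0, 5.5, 4.0, 4.5)
```

The factor 0.5 on the angle reflects the 180-degree ambiguity of linear polarization; the degree is a ratio, so the absolute flux calibration cancels.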
Abstract:
Spectral albedo was measured along a 6 km transect near the Allan Hills in East Antarctica. The transect traversed the sequence from new snow through old snow, firn, and white ice to blue ice, showing a systematic progression of decreasing albedo at all wavelengths, as well as decreasing specific surface area (SSA) and increasing density. Broadband albedos range from 0.80 for snow to 0.57 for blue ice under clear sky, and from 0.87 to 0.65 under cloud. Both air bubbles and cracks scatter sunlight; their contributions to SSA were determined by micro-computed tomography on core samples of the ice. Although albedo is governed primarily by the SSA (and secondarily by the shape) of bubbles or snow grains, albedo also correlates highly with porosity, which, as a proxy variable, would be easier for ice-sheet models to predict than bubble sizes. Albedo parameterizations are therefore developed as a function of density for three broad wavelength bands commonly used in general circulation models: visible, near-infrared, and total solar. The relevance to Snowball Earth events derives from the likelihood that sublimation of equatorward-flowing sea glaciers during those events progressively exposed the same sequence of surface materials that we measured at Allan Hills, with our short 6 km transect representing a transect across many degrees of latitude on the Snowball ocean. At the equator of Snowball Earth, climate models predict thick ice, thin ice, or open water, depending largely on their albedo parameterizations; our measured albedos appear to be within the range that favors ice hundreds of meters thick.
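A density-based albedo parameterization of the kind described can be sketched as a per-band least-squares fit of albedo against density; the density-albedo pairs below are invented to mimic the reported snow-to-blue-ice decrease and are not the paper's measured values.

```python
def fit_linear_albedo(densities, albedos):
    """Ordinary least-squares fit of albedo = a + b * density for one band."""
    n = len(densities)
    mx = sum(densities) / n
    my = sum(albedos) / n
    b = sum((x - mx) * (y - my) for x, y in zip(densities, albedos)) / \
        sum((x - mx) ** 2 for x in densities)
    a = my - b * mx
    return a, b

# Hypothetical total-solar albedos along the snow -> blue-ice sequence
# (density in kg m^-3), mimicking the decrease reported above.
rho = [300, 500, 700, 850, 900]
alb = [0.80, 0.74, 0.68, 0.61, 0.57]
a, b = fit_linear_albedo(rho, alb)
predict = lambda d: a + b * d
```

A climate model carrying density as a prognostic variable could then evaluate such a fit per band (visible, near-infrared, total solar) instead of tracking bubble sizes; the paper's actual parameterizations may well be nonlinear in density.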
Abstract:
Doctoral thesis, Ecology (Population Ecology), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015