917 results for Charged System Search


Relevance:

30.00%

Publisher:

Abstract:

Images of the site of the Type Ic supernova (SN) 2002ap taken before explosion were analysed previously by Smartt et al. We have uncovered new unpublished, archival pre-explosion images from the Canada-France-Hawaii Telescope (CFHT) that are vastly superior in depth and image quality. In this paper we present a further search for the progenitor star of this unusual Type Ic SN. Aligning high-resolution Hubble Space Telescope observations of the SN itself with the archival CFHT images allowed us to pinpoint the location of the progenitor site on the ground-based observations. We find that a source visible in the B- and R-band pre-explosion images close to the position of the SN is (1) not coincident with the SN position within the uncertainties of our relative astrometry and (2) still visible ~4.7 yr post-explosion in late-time observations taken with the William Herschel Telescope. We therefore conclude that it is not the progenitor of SN 2002ap. We derived absolute limiting magnitudes for the progenitor of M_B ≥ -4.2 ± 0.5 and M_R ≥ -5.1 ± 0.5. These are the deepest limits yet placed on a Type Ic SN progenitor. We rule out all massive stars with initial masses greater than 7-8 M⊙ (the lower mass limit for stars to undergo core collapse) that have not evolved to become Wolf-Rayet stars. This is consistent with the prediction that Type Ic SNe should result from the explosions of Wolf-Rayet stars. Comparing our luminosity limits with stellar models of single stars at appropriate metallicity (Z = 0.008) and with standard mass-loss rates, we find no model that produces a Wolf-Rayet star of low enough mass and luminosity to be classed as a viable progenitor. Models with twice the standard mass-loss rates provide possible single-star progenitors, but all are initially more massive than 30-40 M⊙. We conclude that any single-star progenitor must have experienced at least twice the standard mass-loss rates, been initially more massive than 30-40 M⊙, and exploded as a Wolf-Rayet star of final mass 10-12 M⊙. Alternatively, a progenitor star of lower initial mass may have evolved in an interacting binary system. Mazzali et al. propose such a binary scenario for the progenitor of SN 2002ap in which a star of initial mass 15-20 M⊙ is stripped by its binary companion, becoming a 5 M⊙ Wolf-Rayet star prior to explosion. We constrain any possible binary companion to a main-sequence star of
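For orientation, limits like those above follow from the apparent detection limits of the pre-explosion frames via the distance modulus and foreground extinction of the host galaxy. A minimal sketch of the relation, with purely illustrative numbers rather than the values adopted in the paper, is:

```latex
M_{\mathrm{lim}} = m_{\mathrm{lim}} - \mu - A_\lambda ,
\qquad \text{e.g. } m_{\mathrm{lim}} = 25.0,\ \mu = 29.5,\ A_B = 0.3
\;\Longrightarrow\; M_B \gtrsim -4.8 .
```

Any star brighter than this limit would have been detected, so an undetected progenitor must be numerically fainter (i.e. its absolute magnitude lies above the limit).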

Relevance:

30.00%

Publisher:

Abstract:

We have begun a search for early-type stars towards the galactic centre which are potentially young objects situated within the inner few kiloparsecs of the disk. U and V (or I) band photographic photometry from the UK Schmidt Telescope has been obtained to identify the bluest candidates in nineteen Schmidt fields (centred close to the galactic centre). We have spectroscopically observed these targets for three fields with the FLAIR multi-fibre system to determine their spectral types. In particular, ten early B-type stars have been identified, and equivalent width measurements of their Balmer and He I lines have been used to estimate atmospheric parameters. These early-type objects have magnitudes in the range 11.5 ≤ V ≤ 16.0, and our best estimates of their distance (given probable highly variable reddening in this direction together with errors in the plate photometry) suggest that some of them originated close to (i.e. R_g

Relevance:

30.00%

Publisher:

Abstract:

The Gray Laboratory charged-particle microbeam has been used to assess the clonogenic ability of Chinese hamster V79 cells after irradiation of their nuclei with a precisely defined number of protons with energies of 1.0 and 3.2 MeV. The microbeam uses a 1-μm silica capillary collimator to deliver protons to subcellular targets with high accuracy. The detection system is based on a miniature photomultiplier tube positioned above the cell dish, which detects the photons generated by the passage of the charged particles through an 18-μm-thick scintillator placed below the cells. With this system, a detection efficiency of greater than 99% is achieved. The cells are plated on specially designed dishes (3-μm-thick Mylar base), and the nuclei are identified by fluorescence microscopy. After an incubation period of 3 days, the cells are revisited individually to assess the formation of colonies from the surviving cells. For each energy investigated, the survival curve obtained for the microbeam shows a significant deviation below 1 Gy from a response extrapolated using the LQ model for the survival data above 1 Gy. The data are well fitted by a model that supports the hypothesis that radioresistance is induced by low-dose hypersensitivity. These studies demonstrate the potential of the microbeam for performing studies of the effects of single charged particles on cells in vitro. The hypersensitive responses observed are comparable with those reported by others using different radiations and techniques. © 2001 by Radiation Research Society.
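The low-dose behaviour described above is often discussed by contrasting the standard linear-quadratic (LQ) survival curve with an induced-repair variant in which radioresistance switches on as dose increases. The sketch below illustrates that comparison; the model form and all parameter values are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def lq_survival(dose, alpha=0.2, beta=0.02):
    """Standard linear-quadratic model: S = exp(-alpha*d - beta*d^2)."""
    return np.exp(-alpha * dose - beta * dose**2)

def induced_repair_survival(dose, alpha_r=0.2, alpha_s=0.8, d_c=0.3, beta=0.02):
    """Induced-repair variant: the effective alpha falls from alpha_s (sensitive)
    towards alpha_r (resistant) as dose increases, giving extra killing below ~1 Gy."""
    alpha_eff = alpha_r * (1.0 + (alpha_s / alpha_r - 1.0) * np.exp(-dose / d_c))
    return np.exp(-alpha_eff * dose - beta * dose**2)

doses = np.linspace(0.0, 4.0, 9)
for d, s_lq, s_ir in zip(doses, lq_survival(doses), induced_repair_survival(doses)):
    print(f"{d:4.1f} Gy  LQ: {s_lq:0.3f}  induced-repair: {s_ir:0.3f}")
```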

Relevance:

30.00%

Publisher:

Abstract:

The use of bit-level systolic arrays in the design of a vector quantized transformed subband coding system for speech signals is described. It is shown how the major components of this system can be decomposed into a small number of highly regular building blocks that interface directly to one another. These include circuits for the computation of the discrete cosine transform, the inverse discrete cosine transform, and vector quantization codebook search.
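As a rough functional illustration of the signal path such hardware implements (transform a subband frame, then search the codebook for the nearest codeword), here is a short sketch; the frame length, codebook size and exhaustive search are illustrative assumptions and say nothing about the bit-level systolic architecture itself:

```python
import numpy as np
from scipy.fft import dct, idct

def encode_frame(frame, codebook):
    """DCT the frame, then pick the codeword with minimum Euclidean distortion."""
    coeffs = dct(frame, norm="ortho")
    distortions = np.sum((codebook - coeffs) ** 2, axis=1)
    return int(np.argmin(distortions))

def decode_index(index, codebook):
    """Inverse DCT of the chosen codeword reconstructs the frame."""
    return idct(codebook[index], norm="ortho")

rng = np.random.default_rng(0)
codebook = rng.normal(size=(256, 16))   # 256 codewords of length 16 (illustrative)
frame = rng.normal(size=16)             # one 16-sample subband frame
idx = encode_frame(frame, codebook)
reconstruction = decode_index(idx, codebook)
```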

Relevance:

30.00%

Publisher:

Abstract:

A bit-level systolic array system for performing a binary tree Vector Quantization codebook search is described. This consists of a linear chain of regular VLSI building blocks and exhibits data rates suitable for a wide range of real-time applications. A technique is described which reduces the computation required at each node in the binary tree to that of a single inner product operation. This method applies to all the common distortion measures (including the Euclidean distance, the Weighted Euclidean distance and the Itakura-Saito distortion measure) and significantly reduces the hardware required to implement the tree search system. © 1990 Kluwer Academic Publishers.
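The node-level reduction can be pictured as follows: under the squared Euclidean distance, choosing between the two child codewords amounts to testing the input against a precomputed hyperplane, i.e. a single inner product plus a scalar comparison. The sketch below shows that idea for a toy two-level tree; the tree construction and codebook are illustrative, and the paper's extension to the weighted Euclidean and Itakura-Saito measures is not shown:

```python
import numpy as np

def make_node(c_left, c_right, left, right):
    """Internal node of a binary codebook-search tree.  Choosing between the two
    candidate codewords under squared Euclidean distance reduces to one inner
    product: pick c_right iff x.(c_right - c_left) > (|c_right|^2 - |c_left|^2)/2."""
    return {
        "normal": c_right - c_left,
        "threshold": 0.5 * (np.dot(c_right, c_right) - np.dot(c_left, c_left)),
        "left": left,
        "right": right,
    }

def tree_search(node, x):
    """Descend the tree; leaves are plain codebook indices (ints)."""
    while isinstance(node, dict):
        branch = "right" if np.dot(x, node["normal"]) > node["threshold"] else "left"
        node = node[branch]
    return node

# Illustrative 2-level tree over a 4-entry codebook.
rng = np.random.default_rng(1)
cb = rng.normal(size=(4, 8))            # 4 codewords of dimension 8
root = make_node((cb[0] + cb[1]) / 2, (cb[2] + cb[3]) / 2,
                 left=make_node(cb[0], cb[1], 0, 1),
                 right=make_node(cb[2], cb[3], 2, 3))
index = tree_search(root, rng.normal(size=8))
```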

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a new reconfigurable multi-standard architecture is introduced for integer-pixel motion estimation and a standard-cell based chip design study is presented. This has been designed to cover most of the common block-based video compression standards, including MPEG-2, MPEG-4, H.263, H.264, AVS and WMV-9. The architecture exhibits simpler control, high throughput and relatively low hardware cost, and is highly competitive when compared with existing designs for specific video standards. It can also, through the use of control signals, be dynamically reconfigured at run-time to accommodate different system constraints, such as the trade-off between power dissipation and video quality. The computational rates achieved make the circuit suitable for high-end video processing applications. Silicon design studies indicate that circuits based on this approach incur only a relatively small penalty in terms of power dissipation and silicon area when compared with implementations for specific standards.
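For reference, the core operation such an architecture accelerates is block matching: for each block of the current frame, search a window of the reference frame for the displacement that minimises a distortion measure such as the sum of absolute differences (SAD). A plain software sketch, in which the block size and search range are illustrative parameters rather than the chip's configuration, might be:

```python
import numpy as np

def full_search_sad(ref, cur, block_y, block_x, block=16, search=8):
    """Integer-pixel motion estimation for one block: exhaustive search of a
    +/-search window in the reference frame, minimising the sum of absolute
    differences (SAD)."""
    target = cur[block_y:block_y + block, block_x:block_x + block].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = block_y + dy, block_x + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            candidate = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - candidate).sum())
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad
```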

Relevance:

30.00%

Publisher:

Abstract:

Objective: This paper uses data provided by the Police Service of Northern Ireland (PSNI) to compare the characteristics and outcomes of reported sexual offences involving child and adult victims, and to explore the factors associated with case outcomes.
Method: PSNI provided data on 8,789 sexual offences recorded between April 2001 and March 2006. Case outcomes were based on whether a case was recorded by police as having sufficient evidence to summons, charge, or caution an offender (detected). Where an offender was summonsed, charged, or cautioned, this is classified as detection with a formal sanction. A case can also be classified as "detected" without a formal sanction. The analysis focused on two key categories of detection without formal sanction: cases in which the police deemed there to be sufficient evidence to charge an offender but took no further action because the victim did not wish to prosecute, or because the police or the Public Prosecution Service (PPS) decided that no useful purpose would be served by proceeding.
Results: The analysis confirmed that the characteristics of recorded sexual offences involving adult and child victims vary significantly according to gender, offence type, the timing of the report and the victim-offender relationship. Almost half of child sex abuse cases are not detected by police, and a quarter do not proceed through the criminal justice system because either the victim declines to prosecute or the police/PPS decide not to proceed. Only one in five child cases involved detection with a formal sanction. Child groups with lower rates of detection with formal sanction included children under 5, teenagers, those who do not report when the abuse occurs but disclose at a later date, and those who experience abuse at the hands of peers and adults known to them but not related. The analysis also highlighted variation in formal sanction rates depending on where the offence was reported.
Conclusions: Consideration needs to be given to improving the criminal justice response to specific child groups as well as monitoring detection rates in different police areas in order to address potential practice variation.
Practice implications: Consideration needs to be given to improving the professional response to child groups with particularly low rates of detection with formal sanction. There is also a need to monitor case outcomes to ensure that child victims in different areas receive a similar service.

Relevance:

30.00%

Publisher:

Abstract:

Arguably, the myth of Shakespeare is a myth of universality. Much has been written about the dramatic, thematic and ‘humanistic’ transference of Shakespeare’s works: their permeability, transcendence of cultures and histories, geographies and temporalities. Located within this debate is a belief that this universality, among other dominating factors, is founded upon the power and poeticism of Shakespeare’s language. Subsequently, if we acknowledge Frank Kermode’s assertion that “the life of the plays is the language” and “the secret (of Shakespeare’s works) is in the detail,” what then becomes of this myth of universality, and how is Shakespeare’s language ‘transferred’ across cultures? In Asian intercultural adaptations, language becomes the primary site of confrontation as issues of semantic accuracy and poetic affiliation abound. Often, the language of the text is replaced with a cultural equivalent or reconceived with other languages of the stage – song and dance, movement and music; metaphor and imagery consequently find new voices. Yet if myth is, as Roland Barthes propounds, a second-order semiotic system that is predicated upon the already constituted sign, here being language, and myth is parasitical on language, what happens to the myth of Shakespeare in these cultural re-articulations? Wherein lies the ‘universality’? Or is ‘universality’ all that it is – an insubstantial (mythical) pageant? Using Ong Keng Sen’s Search Hamlet (2002), this paper examines the transference of myth and / as language in intercultural Shakespeares. If, as Barthes argues, myths are to be understood as metalanguages that adumbrate social hegemonies, intercultural imaginings of Shakespeare can be said to expose the hollow myth of universality, yet in a paradoxical double-bind they reify and reinstate this self-same myth.

Relevance:

30.00%

Publisher:

Abstract:

Burial grounds are commonly surveyed and searched by both police/humanitarian search teams and archaeologists. One aspect of an efficient search is to establish areas free of recent interments to allow the concentration of assets in suspect terrain. While 100% surety in locating remains can never be achieved, the deployment of a red, amber, green (RAG) system for assessment has proven invaluable to our surveys. The RAG system is based on a desktop study (including burial ground records), visual inspection (mounding, collapses) and the use of geophysics (in this case, ground-penetrating radar or GPR) for a multi-proxy assessment that provides search authorities with an assessment of the state of inhumations and a level of legal backup for the decisions they make on whether or not to excavate (an ‘exit strategy’). The system is flexible and will be built upon as research continues.
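A schematic of how such a multi-proxy traffic-light assessment could be combined is sketched below; the proxy names, the simple vote count and the thresholds are illustrative assumptions, not the authors' scheme:

```python
def rag_assessment(desktop_clear, visual_clear, gpr_clear):
    """Combine three proxies (burial-ground records, visual inspection, GPR survey)
    into a red/amber/green flag.  Each argument is True if that proxy shows no
    evidence of recent interment; the vote-count thresholds are illustrative."""
    clear_count = sum([desktop_clear, visual_clear, gpr_clear])
    if clear_count == 3:
        return "GREEN"   # all proxies clear: low priority for intrusive search
    if clear_count == 2:
        return "AMBER"   # one proxy ambiguous or positive: further survey advised
    return "RED"         # multiple indications: treat as suspect terrain

print(rag_assessment(desktop_clear=True, visual_clear=True, gpr_clear=False))  # AMBER
```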

Relevance:

30.00%

Publisher:

Abstract:

The Bacillus licheniformis I89 strain is capable of producing some compounds with antibacterial activity. In the present study, the separation of these compounds was carried out through the application of several procedures, including solid-phase extraction and high-pressure liquid chromatography. Two of these bioactive compounds constitute the class II lantibiotic lichenicidin and are characterized by molecular masses of 3250 Da (Bliα) and 3020 Da (Bliβ). The cluster responsible for the biosynthesis of lichenicidin was heterologously expressed in Escherichia coli, constituting the first description of the fully in vivo production of a lantibiotic in a Gram-negative host. This system was subsequently exploited with the aim of relating each protein encoded in the lichenicidin gene cluster to the production of the peptides Bliα and Bliβ. The development of a trans-complementation system enabled the production of variants of these peptides. Analysis of the molecular masses of these variants, as well as of the fragmentation patterns obtained by MS/MS, allowed the revision of some of the structural features previously proposed for Bliα and Bliβ. Analysis of the genes putatively involved in protecting the producer strain against the antibiotic action of lichenicidin revealed that, in E. coli, their absence does not result in increased susceptibility to this compound. It was also found that the presence of these genes is not essential for lichenicidin production in E. coli. It was further confirmed experimentally that the outer membrane of E. coli constitutes a natural barrier to the entry of the peptides into the cell. Indeed, one of the intriguing features of lichenicidin production by a Gram-negative bacterium lies in the mechanism by which the two peptides are transported across the outer membrane. In this study it was demonstrated that, in the absence of the membrane protein TolC, the molecular masses of Bliα and Bliβ were not identified in the E. coli supernatant, thereby demonstrating that their presence in the extracellular environment was not due to bacterial lysis. The capacity of the lichenicidin biosynthetic machinery to produce the lantibiotic haloduracin, through the processing of lichenicidin-haloduracin chimeras, was also evaluated; however, the results were negative. It was also observed that, under certain incubation conditions, differentiation from the original morphology of the B. licheniformis I89 strain can occur. This dissociation involved a transition from the parental, rough colony to a colony of simpler, smoother appearance. The differences between the two morphologies in terms of growth rate, sporulation and antibiotic activity were therefore investigated. Considering Bliα and Bliβ specifically, the abundance of these peptides in cultures of the smooth phenotype was generally lower than that identified in cultures of the parental phenotype. Finally, the diversity of genetic elements encoding non-ribosomal peptide synthetases (NRPS) was investigated in lagoons in central Portugal and in soils from caves in southern Portugal, revealing the presence of potentially novel NRPS in these environments.

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the design and implementation of a reliable centimeter-level indoor positioning system fully compatible with a conventional smartphone. The proposed system takes advantage of the smartphone audio I/O and processing capabilities to perform acoustic ranging in the audio band using non-invasive audio signals, and it has been developed with applications that require high accuracy in mind, such as augmented reality, virtual reality, gaming and audio guides. The system works in a distributed operation mode, i.e. each smartphone is able to obtain its own position using only acoustic signals. To support the positioning system, a Wireless Sensor Network (WSN) of synchronized acoustic beacons is used. To keep the infrastructure in sync we have developed an Automatic Time Synchronization and Syntonization (ATSS) protocol with a standard deviation of the sync offset error below 1.25 μs. Using an improved Time Difference of Arrival (TDoA) estimation approach (which takes advantage of the beacon signals’ periodicity) and by performing Non-Line-of-Sight (NLoS) mitigation, we were able to obtain very stable and accurate position estimates, with an absolute mean error of less than 10 cm in 95% of the cases and a mean standard deviation of 2.2 cm for a position refresh period of 350 ms.
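For context, once the TDoA measurements are available, the position estimate itself is a multilateration problem. A minimal sketch using a generic nonlinear least-squares solver is shown below; the beacon layout, speed of sound and solver are illustrative assumptions, and the thesis's improved TDoA estimator and NLoS mitigation are not reproduced here:

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s, assumed constant

def tdoa_position(beacons, tdoas, x0=(0.0, 0.0)):
    """Estimate a 2-D position from time-differences-of-arrival measured against
    beacon 0.  `beacons` is an (N, 2) array of known beacon coordinates and
    `tdoas[i]` is the arrival-time difference between beacon i+1 and beacon 0."""
    beacons = np.asarray(beacons, dtype=float)

    def residuals(p):
        dists = np.linalg.norm(beacons - p, axis=1)
        predicted = (dists[1:] - dists[0]) / SPEED_OF_SOUND
        return predicted - np.asarray(tdoas, dtype=float)

    return least_squares(residuals, x0=np.asarray(x0, dtype=float)).x

# Illustrative setup: four beacons in a 6 m x 6 m room, receiver at (2.0, 1.5).
beacons = [(0, 0), (6, 0), (6, 6), (0, 6)]
true_pos = np.array([2.0, 1.5])
dists = np.linalg.norm(np.array(beacons) - true_pos, axis=1)
tdoas = (dists[1:] - dists[0]) / SPEED_OF_SOUND
print(tdoa_position(beacons, tdoas))   # ~[2.0, 1.5]
```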

Relevance:

30.00%

Publisher:

Abstract:

This Ph.D. thesis proposes a speculative lens to read Internet Art via the concept of digital debris. In order to do so, the research explores the idea of digital debris in Internet Art from 1993 to 2011 in a series of nine case studies. Here, digital debris are understood as words typed into search engines which then disappear; bits of obsolete code lingering on the Internet; abandoned websites, broken links, or pieces of ephemeral information circulating on the Internet which are used as a material by practitioners. In this context, the thesis asks: what are digital debris? The thesis argues that the digital debris of Internet Art represent an allegorical and entropic resistance to what art historian David Joselit calls the Epistemology of Search. The ambition of the research is to develop a language in-between the agency of the artist and the autonomy of the algorithm, as a way of introducing Internet Art to a pluridisciplinary audience, hence the presence of the comparative studies unfolding throughout the thesis, between Internet Art and pioneers in the recycling of waste in art, the use of instructions as a medium and the programming of poetry. While many anthropological and ethnographical studies are concerned with the material object of the computer as debris once it becomes obsolete, very few studies have analysed waste as discarded data. The research shifts the focus from an industrial production of digital debris (such as pieces of hardware) to obsolete pieces of information in art practice. The research demonstrates that illustrations of such considerations can be found, for instance, in Cory Arcangel’s work Data Diaries (2001), where QuickTime files are stolen, disassembled, and then re-used in new displays. The thesis also looks at Jodi’s approach in Jodi.org (1993) and Asdfg (1998), where websites and hyperlinks are detourned, deconstructed, and presented in abstract collages that reveal the architecture of the Internet. The research starts in a typological manner and classifies the pieces of Internet Art according to the structure at play in the work. Indeed, if some online works dealing with discarded documents offer a self-contained and closed system, others nurture the idea of openness and unpredictability. The thesis foregrounds the ideas generated through the artworks and interprets how the latter are visually constructed and displayed. Not only does the research question the status of digital debris once they are incorporated into art practice, but it also examines the method by which they are retrieved, manipulated and displayed, to submit that the digital debris of Internet Art are the result of both semantic and automated processes, rendering them both an object of discourse and a technical reality. Finally, in order to frame the serendipity and process-based nature of the digital debris, the Ph.D. concludes that digital debris are entropic: in other words, that they are items of language-to-be, paradoxically locked in a constant state of realisation.

Relevance:

30.00%

Publisher:

Abstract:

In proposing theories of how we should design and specify networks of processes, it is necessary to show that the semantics of any language we use to write down the intended behaviours of a system has several qualities: first, that the meaning of what is written on the page reflects the intention of the designer; second, that there are no unexpected behaviours that might arise in a specified system that are hidden from the unsuspecting specifier; and third, that the intention for the design of the behaviour of a network of processes can be communicated clearly and intuitively to others. In order to achieve this we have developed a variant of CSP, called CSPt, designed to solve the problems of termination of parallel processes present in the original formulation of CSP. In CSPt we introduced three parallel operators, each with a different kind of termination semantics, which we call synchronous, asynchronous and race. These operators provide specifiers with an expressive and flexible toolkit to define the intended behaviour of a system in such a way that unexpected or unwanted behaviours are guaranteed not to take place. In this paper we extend our analysis of CSPt and introduce the notion of an alphabet diagram, which illustrates the different categories of events that can arise in the parallel composition of processes. These alphabet diagrams are then used to analyse networks of three processes in parallel, with the aim of identifying sufficient constraints to ensure associativity of their parallel composition. Having achieved this, we then proceed to prove associativity laws for the three parallel operators of CSPt. Next, we illustrate how to design and construct a network of three processes that satisfies the associativity law, using the associativity theorem and alphabet diagrams. Finally, we outline how this could be achieved for more general networks of processes.

Relevance:

30.00%

Publisher:

Abstract:

A manufacturing system has a naturally dynamic nature, observed through several kinds of random occurrences and perturbations in working conditions and requirements over time. For this kind of environment, the ability to efficiently and effectively adapt existing schedules to such disturbances on a continuous basis, while keeping performance levels, is important. The application of meta-heuristics and multi-agent systems to the resolution of this class of real-world scheduling problems seems really promising. This paper presents a prototype for MASDScheGATS (Multi-Agent System for Distributed Manufacturing Scheduling with Genetic Algorithms and Tabu Search).
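To make the meta-heuristic side concrete, the sketch below shows a minimal tabu search applied to a single-machine total-tardiness instance; the objective, swap neighbourhood and tabu tenure are illustrative choices and are not taken from the MASDScheGATS design:

```python
import random

def total_tardiness(order, proc, due):
    """Total tardiness of a job sequence on a single machine."""
    t, tard = 0, 0
    for j in order:
        t += proc[j]
        tard += max(0, t - due[j])
    return tard

def tabu_search(proc, due, iters=200, tenure=7, seed=0):
    """Minimal tabu search over pairwise swaps of a job sequence."""
    rng = random.Random(seed)
    n = len(proc)
    current = list(range(n))
    rng.shuffle(current)
    best, best_cost = current[:], total_tardiness(current, proc, due)
    tabu = {}  # swapped pair -> iteration until which it stays tabu
    for it in range(iters):
        candidates = []
        for i in range(n - 1):
            for k in range(i + 1, n):
                neigh = current[:]
                neigh[i], neigh[k] = neigh[k], neigh[i]
                candidates.append((total_tardiness(neigh, proc, due), (i, k), neigh))
        candidates.sort(key=lambda c: c[0])
        for cost, move, neigh in candidates:
            # Accept the best non-tabu move, or a tabu move that improves the best (aspiration).
            if tabu.get(move, -1) < it or cost < best_cost:
                current = neigh
                tabu[move] = it + tenure
                if cost < best_cost:
                    best, best_cost = neigh[:], cost
                break
    return best, best_cost

proc = [4, 2, 6, 3, 5]      # processing times (illustrative instance)
due = [6, 4, 18, 8, 10]     # due dates
print(tabu_search(proc, due))
```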

Relevance:

30.00%

Publisher:

Abstract:

To comply with natural gas demand growth patterns and Europe's import dependency, the gas industry needs to organize an efficient upstream infrastructure. The best location of Gas Supply Units (GSUs) and the alternative transportation mode – physical or virtual pipelines – are the keys to a successful industry. In this work we study the optimal location of GSUs, as well as determining the most efficient allocation of gas loads to sources, selecting the best transportation mode, observing specific technical restrictions and minimizing system total costs. For locating GSUs in the system we use the P-median problem; for assigning gas demand nodes to source facilities we use the classical transportation problem. The developed model is an optimisation-based approach built on a Lagrangean heuristic, using Lagrangean relaxation for P-median problems – the Simple Lagrangean Heuristic. The solution of this heuristic can be improved by adding a local search procedure – the Lagrangean Reallocation Heuristic. These two heuristics, Simple Lagrangean and Lagrangean Reallocation, were tested on a realistic network – the primary Iberian natural gas network, organized with 65 nodes connected by physical and virtual pipelines. Computational results are presented for both approaches, showing the location of gas sources, the load allocation arrangement, system total costs and the gas transportation mode.
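A minimal sketch of the Simple Lagrangean Heuristic idea for the P-median part is given below: relax the assignment constraints, open the p facilities with the most negative reduced contributions, derive a feasible assignment for an upper bound, and update the multipliers by subgradient steps. The random instance, step rule and iteration limit are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def p_median_lagrangean(d, p, iters=300, theta=2.0):
    """Simple Lagrangean heuristic for the P-median problem on a distance matrix d
    (demand nodes x candidate facilities)."""
    n_dem, n_fac = d.shape
    lam = d.min(axis=1).astype(float)              # initial multipliers
    best_ub, best_open, lb_best = np.inf, None, -np.inf
    for _ in range(iters):
        reduced = np.minimum(d - lam[:, None], 0.0)    # benefit of serving i from j
        contrib = reduced.sum(axis=0)                  # contribution of opening j
        open_fac = np.argsort(contrib)[:p]             # p most negative contributions
        lb = lam.sum() + contrib[open_fac].sum()       # Lagrangean lower bound
        lb_best = max(lb_best, lb)

        # Feasible (upper-bound) solution: assign every node to its nearest open facility.
        assign = open_fac[np.argmin(d[:, open_fac], axis=1)]
        ub = d[np.arange(n_dem), assign].sum()
        if ub < best_ub:
            best_ub, best_open = ub, open_fac

        # Subgradient step on the relaxed assignment constraints.
        x = (d[:, open_fac] - lam[:, None] < 0)        # relaxed assignments
        g = 1.0 - x.sum(axis=1)
        if not g.any() or best_ub - lb <= 1e-9:
            break
        lam += theta * (best_ub - lb) / (g @ g) * g
    return best_open, best_ub, lb_best

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(30, 2))                # 30 nodes; any node can host a GSU
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
print(p_median_lagrangean(d, p=4))
```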