27 results for Astronomy, Egyptian.


Relevance:

10.00%

Publisher:

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the advent of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
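Convolution, the first core topic listed above, can be illustrated with a minimal sketch (not taken from the book; the function name and test signals are purely illustrative):

```python
# Minimal illustration of discrete linear convolution,
# y[n] = sum_k x[k] * h[n - k], the core time-domain operation
# relating a digital filter's input x to its output y.

def convolve(x, h):
    """Linear convolution of two finite sequences."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

# A 3-point moving-average filter (a simple low-pass FIR filter)
# smoothing a short rectangular pulse:
smoothed = convolve([0, 3, 3, 3, 0], [1/3, 1/3, 1/3])
```

For long signals this O(N²) loop would normally be replaced by FFT-based fast convolution, one of the efficiency gains associated with DSP.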

Relevance:

10.00%

Publisher:

Abstract:

Food is a vital foundation of all human life. It is essential to a myriad of political, socio-cultural, economic and environmental practices throughout history. However, those practices of food production, consumption, and distribution now have the potential to undergo immensely transformative shifts as network technologies become increasingly embedded in every domain of contemporary life. Information and communication technologies (ICTs) are one of the key foundations of global functionality and sustenance today, and will undoubtedly continue to present new challenges and opportunities for the future. As such, this symposium will bring together leading scholars across disciplines to address challenges and opportunities at the intersection of food and ICTs in everyday urban environments. In particular, the discussion will revolve around the question: what key roles do network technologies play in re-shaping food systems from the micro to the macro level? The symposium will contribute a unique perspective on urban food futures through the lens of the network society paradigm, in which ICTs enable innovations in production, organisation, and communication within society. Some of the topics addressed will include encouraging transparency in food commodity chains; the value of cultural understanding and communication in global food sustainability; and technologies for social inclusion; all of which evoke and examine questions surrounding networked individuals as catalysts of change for urban food futures. The event will provide an avenue for new discussions and speculations on key issues surrounding urban food futures in the network era, with a particular focus on bottom-up micro actions that challenge existing food systems towards broader sociocultural, political, technological, and environmental transformations.
One central area of concern is that current systems of food production, distribution, and consumption do not ensure food security for the future, but rather seriously threaten it. With the recent unprecedented scale of urban growth and the rise of the middle class, the problem continues to intensify. This situation requires extensive distribution networks to feed urban residents, and therefore poses significant infrastructural challenges to both the public and private sectors. The symposium will also address the transferability of the citizen empowerment that network technologies enable, as demonstrated in various significant bottom-up global political transformations, such as the recent Egyptian Youth Revolution. Another key theme of the discussion will be the role of ICTs (and the practices that they mediate) in fostering transparency in commodity chains. The symposium will ask what difference these technologies can make to the practices of food consumption and production. After the discussions, we will initiate an international network of food thinkers and actors that will function as a platform for knowledge sharing and collaboration. The participants will be invited to engage in planning for the ongoing future development of the network.

Relevance:

10.00%

Publisher:

Abstract:

The underlying objective of this study was to develop a novel approach to evaluating the potential for commercialisation of a new technology. More specifically, this study examined the 'ex-ante' evaluation of the technology transfer process. For this purpose, a technology originating from the high technology sector was used. The technology relates to the application of software for the detection of weak signals from space, an established method of signal processing in the field of radio astronomy. This technology has the potential to be used in commercial and industrial areas other than astronomy, such as detecting water leakages in pipes. Its applicability to detecting water leakage was chosen owing to several problems with detection in the industry, as well as the impact it can have on saving water in the environment. This study, therefore, demonstrates the importance of interdisciplinary technology transfer. The study employed both technical and business evaluation methods, including laboratory experiments and the Delphi technique, to address the research questions. There are several findings from this study. Firstly, scientific experiments were conducted, and these brought the chosen technology to a proof-of-concept stage. Secondly, criteria from the literature that can be used for 'ex-ante' evaluation of technology transfer were validated and refined. Additionally, after testing the chosen technology's overall transfer potential using the modified set of criteria, it was found that the technology is still in its early stages and will require further development before it can be commercialised. Furthermore, a final evaluation framework was developed encompassing all the criteria found to be important. This framework can help in assessing the overall readiness of a technology for transfer, as well as in recommending a viable mechanism for commercialisation. On the whole, the commercial potential of the chosen technology was tested through expert opinion, focusing on the impact of a new technology and the feasibility of alternate applications and potential future applications.
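The study's own software is not described here, but the general radio-astronomy technique it refers to — picking a known weak signature out of noise — can be sketched as a cross-correlation (matched-filter) detector. The function names, signals and threshold below are entirely hypothetical, not the study's:

```python
# Hypothetical sketch of matched-filter detection: slide a known
# template along a noisy signal and flag alignments where the
# correlation exceeds a threshold (e.g. a leak signature in
# pipe-sensor data, by analogy with weak astronomical signals).

def cross_correlate(signal, template):
    """Correlation of the template at every alignment in the signal."""
    n = len(signal) - len(template) + 1
    return [sum(signal[i + k] * template[k] for k in range(len(template)))
            for i in range(n)]

def detect(signal, template, threshold):
    """Return the alignments whose correlation exceeds the threshold."""
    scores = cross_correlate(signal, template)
    return [i for i, s in enumerate(scores) if s > threshold]
```

In practice the threshold would be set from the noise statistics; this sketch shows only the correlation principle.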

Relevance:

10.00%

Publisher:

Abstract:

A polymerase chain reaction (PCR) assay was developed for the detection of Banana bunchy top virus (BBTV), taking at most 210 min and at least 90 min using Pc-1 and Pc-2, respectively. PCR detection of BBTV in crude sap indicated that freezing banana tissue in liquid nitrogen (LN2) before extraction was more effective than using sand as the extraction technique. BBTV was also detected by PCR assay in 69 healthy and diseased plants using Na-PO4 buffer containing 1% SDS. PCR detection of BBTV in nucleic acid extracts was studied using seven different extraction buffers, with the aim of adapting PCR for routine detection in the field. The results proved that BBTV was detected with higher sensitivity in nucleic acid extracts than in infectious sap. The results also suggested a common aetiology for BBTV, based on the PCR reactions of BBTV in nucleic acid extracts from Australia, Burundi, Egypt, France, Gabon, the Philippines and Taiwan. The results also showed a positive relation between the Egyptian BBTV isolate and an abaca bunchy top isolate from the Philippines, but no relation was found with Cucumber mosaic cucumovirus (CMV) isolates from Egypt and the Philippines or with Banana bract mosaic virus (BBMV).

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a simple activity for plotting and characterising the light curve from an exoplanet transit event by way of differential photometry analysis. Using free digital imaging software, participants analyse a series of telescope images with the goal of calculating various exoplanet parameters, including its size, orbital radius and habitability. The activity has been designed for a high-school or undergraduate university level and introduces fundamental concepts in astrophysics and an understanding of the basis for exoplanetary science, the transit method and digital photometry.
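As a hedged illustration of the transit method described above (the numbers are hypothetical, not taken from the activity): the fractional dip in the star's brightness gives the planet-to-star radius ratio, since depth ≈ (Rp/Rs)².

```python
import math

# Transit depth is the fractional drop in stellar flux during transit;
# for a dark planet crossing a uniform stellar disc, depth = (Rp/Rs)**2,
# so the planet's radius follows directly from the light curve.

def planet_radius(depth, stellar_radius_km):
    """Planet radius implied by a transit of the given fractional depth."""
    return stellar_radius_km * math.sqrt(depth)

R_SUN_KM = 696_000  # solar radius in kilometres

# A hypothetical 1% dip around a Sun-sized star implies a planet
# about one tenth of the stellar radius, i.e. roughly Jupiter-sized:
r_p = planet_radius(0.01, R_SUN_KM)
```

The orbital radius and habitability estimates mentioned in the abstract require, in addition, the transit period and the star's luminosity, which this sketch does not cover.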

Relevance:

10.00%

Publisher:

Abstract:

A recent theoretical investigation by Terzieva & Herbst of linear carbon chains Cn (n ≥ 6) in the interstellar medium has shown that these species can undergo efficient radiative association to form the corresponding anions. An experimental study by Barckholtz, Snow & Bierbaum of these anions has demonstrated that they do not react efficiently with molecular hydrogen, leading to the possibility of detectable abundances of cumulene-type anions in dense interstellar and circumstellar environments. Here we present a series of electronic structure calculations which examine possible anionic candidates for detection in these media, namely the anion analogues of the previously identified interstellar cumulenes CnH and Cn-1CH2 and heterocumulenes CnO (where n = 2-10). The extraordinary electron affinities calculated for these molecules suggest that efficient radiative electron attachment could occur, and the large dipole moments of these simple (generally) linear molecules point to the possibility of detection by radio astronomy.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents Australian results from the Interests and Recruitment in Science (IRIS) study with respect to the influence of STEM-related mass media, including science fiction, on students’ decisions to enrol in university STEM courses. The study found that across the full cohort (N=2999), students tended to attribute far greater influence to science-related documentaries/channels such as Life on Earth and the Discovery Channel, etc. than to science-fiction movies or STEM-related TV dramas. Males were more inclined than females to consider science fiction/fantasy books and films and popular science books/magazines as having been important in their decisions. Students taking physics/astronomy tended to rate the importance of science fiction/fantasy books and films higher than students in other courses. The implications of these results for our understanding of influences on STEM enrolments are discussed.

Relevance:

10.00%

Publisher:

Abstract:

"Historically, science had a place in education before the time of Plato and Aristotle (e.g., Stonehenge). Technology gradually increased since early human inventions (e.g., indigenous tools and weapons), rose up dramatically through the industrial revolution and escalated exponentially during the twentieth and twenty-first centuries, particularly with the advent of the Internet. Engineering accomplishments were evident in the constructs of early civil works, including roads and structural feats such as the Egyptian pyramids. Mathematics was not as clearly defined BC (Seeds 2010), but was utilized for more than two millennia (e.g., Archimedes, Kepler, and Newton) and paved its way into education as an essential scientific tool and a way of discovering new possibilities. Hence, combining science, technology, engineering, and mathematics (STEM) areas should not come as a surprise but rather as a unique way of packaging what has been ..."--Publisher Website

Relevance:

10.00%

Publisher:

Abstract:

The cliché about modern architecture being the fairy-tale fulfillment of every fantasy ceases to be a cliché only when it is accompanied by the fairy tale’s moral: that the fulfillment of the wishes rarely engenders goodness in the one doing the wishing (Adorno). Wishing for the right things in architecture and the city is the most difficult art of all: since the grim childhood-tales of the twentieth century we have been weaned from dreams and utopias, the stuff of modernism’s bad conscience. For Adorno writing in 1953, Hollywood cinema was a medium of “regression” based on infantile wish fulfillment manufactured by the industrial repetition (mimesis) of the filmic image that he called a modern “hieroglyphics,” like the archaic language of pictures in Ancient Egypt which guaranteed immortality after death in Egyptian burial rites. Arguably, today the iconic architecture industry is the executor of archaic images of modernity linked to rituals of death, promises of omnipotence and immortality. As I will argue in this symposium, such buildings are not a reflection of external ‘reality,’ but regression to an internal architectural polemic that secretly carries out the rituals of modernism’s death and seeks to make good on the liabilities of architectural history.

Relevance:

10.00%

Publisher:

Abstract:

This article describes a parallax experiment performed by undergraduate physics students at Queensland University of Technology. The experiment is analogous to the parallax method used in astronomy to measure distances to the local stars. The result of one of these experiments is presented in this paper. A target was photographed using a digital camera at five distances between 3 and 8 metres from two vantage points spaced 0.6 m apart. The parallax distances were compared with the actual distance measured using a tape measure and the average error was 0.5 ± 0.9 %.
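The small-angle geometry behind the experiment can be sketched as follows (only the 0.6 m baseline comes from the text; the measured angle below is illustrative):

```python
# Parallax distance estimate: a target viewed from two vantage points
# a baseline b apart shifts by a parallax angle p against the distant
# background; for small angles, the distance is d ≈ b / p. This is the
# same geometry astronomers use with the Earth's orbit as the baseline.

def parallax_distance(baseline_m, parallax_rad):
    """Small-angle parallax distance from baseline and parallax angle."""
    return baseline_m / parallax_rad

# With the experiment's 0.6 m baseline, a hypothetical measured
# parallax of 0.1 rad places the target at about 6 m, within the
# 3-8 m range used in the experiment:
d = parallax_distance(0.6, 0.1)
```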

Relevance:

10.00%

Publisher:

Abstract:

For Adorno writing in 1953, Hollywood cinema was a medium of “regression” based on infantile wish fulfillment manufactured by the industrial repetition of the filmic image that he called a modern “hieroglyphics”—like the archaic language of pictures in Ancient Egypt, which guaranteed immortality after death in Egyptian burial rites. From that 1953 essay Prolog zum Fernsehen to Das Schema der Massenkultur in 1981, Adorno likened film frames to cultural ideograms: What he called the filmic “language of images” (Bildersprache) constituted a Hieroglyphenschrift that visualised forbidden sexual impulses and ideations of death and domination in the unconscious of the mass spectator. In his famous passage he writes, “As image, the image-writing (Bilderschrift) is a medium of regression, where the producer and consumer coincide; as writing, film resurrects the archaic images of modernity.” In other words, cinema takes the spectator on a journey into his unconscious in order to control him from within. It works, because the spectator begins to believe the film is speaking to him in his very own image-language (the unconscious), making him do and buy whatever capitalism demands. Modernity for Adorno is precisely the instrumentalisation of the collective unconscious through the mediatic images of the culture industry.

Relevance:

10.00%

Publisher:

Abstract:

Aims: We combine measurements of weak gravitational lensing from the CFHTLS-Wide survey, supernovae Ia from CFHT SNLS and CMB anisotropies from WMAP5 to obtain joint constraints on cosmological parameters, in particular the dark-energy equation-of-state parameter w. We assess the influence of systematics in the data on the results and look for possible correlations with cosmological parameters. Methods: We implemented an MCMC algorithm to sample the parameter space of a flat ΛCDM model with a dark-energy component of constant w. Systematics in the data are parametrised and included in the analysis. We determine the influence of photometric calibration of SNIa data on cosmological results by calculating the response of the distance modulus to photometric zero-point variations. The weak lensing data set is tested for anomalous field-to-field variations and a systematic shape measurement bias for high-redshift galaxies. Results: Ignoring photometric uncertainties for SNLS biases cosmological parameters by at most 20% of the statistical errors, using supernovae alone; the parameter uncertainties are underestimated by 10%. The weak-lensing field-to-field variance between 1 deg² MegaCam pointings is 5-15% higher than predicted from N-body simulations. We find no bias in the lensing signal at high redshift, within the framework of a simple model, and marginalising over cosmological parameters. Assuming a systematic underestimation of the lensing signal, the normalisation increases by up to 8%. Combining all three probes we obtain -0.10 < 1 + w < 0.06 at 68% confidence (-0.18 < 1 + w < 0.12 at 95%), including systematic errors. Our results are therefore consistent with the cosmological constant Λ. Systematics in the data increase the error bars by up to 35%; the best-fit values change by less than 0.15.
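The paper's MCMC analysis combines all three probes; as a toy sketch of the Metropolis sampling idea only, the following draws samples of w from a hypothetical one-dimensional Gaussian posterior centred on w = -1 (the stand-in likelihood, step size and seed are all assumptions, not the paper's pipeline):

```python
import math
import random

def log_post(w, mean=-1.0, sigma=0.08):
    """Hypothetical Gaussian log-posterior for w (stand-in likelihood)."""
    return -0.5 * ((w - mean) / sigma) ** 2

def metropolis(n_steps, w0=-1.0, step=0.05, seed=0):
    """Metropolis sampler: propose a Gaussian random step, accept it
    with probability min(1, posterior ratio), otherwise stay put."""
    rng = random.Random(seed)
    chain, w = [], w0
    for _ in range(n_steps):
        prop = w + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_post(prop) - log_post(w))):
            w = prop
        chain.append(w)
    return chain

chain = metropolis(20000)
mean_w = sum(chain) / len(chain)  # posterior mean estimate for w
```

A real analysis of this kind samples many parameters jointly and marginalises over the systematics nuisance parameters described in the abstract; the one-parameter chain above shows only the sampling mechanism.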