981 results for Astronomy, Egyptian.
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size, and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanography, communications, astronomy, radar engineering, control engineering, and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators, and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
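As a minimal illustration of two of the core topics named above, convolution and the discrete Fourier transform, here is a hedged sketch (not drawn from the book; the test signal and the 5-tap moving-average filter are invented for illustration):

```python
import numpy as np

# A noisy 50 Hz sinusoid sampled at 1 kHz (invented test signal).
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

# Discrete convolution with a 5-tap moving-average filter,
# a simple low-pass FIR filter.
h = np.ones(5) / 5.0
y = np.convolve(x, h, mode="same")

# Discrete Fourier transform: the magnitude spectrum of the
# filtered signal should peak near 50 Hz.
Y = np.fft.rfft(y)
freqs = np.fft.rfftfreq(y.size, d=1.0 / fs)
print("Peak at %.1f Hz" % freqs[np.argmax(np.abs(Y))])
```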
Abstract:
Food is a vital foundation of all human life. It has been essential to a myriad of political, socio-cultural, economic, and environmental practices throughout history. However, the practices of food production, consumption, and distribution now have the potential to go through immensely transformative shifts as network technologies become increasingly embedded in every domain of contemporary life. Information and communication technologies (ICTs) are one of the key foundations of global functionality and sustenance today and will undoubtedly continue to present new challenges and opportunities for the future. As such, this Symposium will bring together leading scholars across disciplines to address challenges and opportunities at the intersection of food and ICTs in everyday urban environments. In particular, the discussion will revolve around the question: what are the key roles that network technologies play in reshaping food systems from the micro to the macro level? The symposium will contribute a unique perspective on urban food futures through the lens of the network society paradigm, in which ICTs enable innovations in production, organisation, and communication within society. Some of the topics addressed will include encouraging transparency in food commodity chains; the value of cultural understanding and communication in global food sustainability; and technologies for social inclusion; all of which evoke and examine questions surrounding networked individuals as catalysts of change for urban food futures. The event will provide an avenue for new discussions and speculations on key issues surrounding urban food futures in the network era, with a particular focus on bottom-up micro actions that challenge the existing food systems towards broader sociocultural, political, technological, and environmental transformations. One central area of concern is that current systems of food production, distribution, and consumption do not ensure food security for the future, but rather seriously threaten it. With the recent unprecedented scale of urban growth and the rise of the middle class, the problem continues to intensify. This situation requires extensive distribution networks to feed urban residents, and therefore poses significant infrastructural challenges to both the public and private sectors. The symposium will also address the transferability of the citizen empowerment that network technologies enable, as demonstrated in various significant bottom-up global political transformations, such as the recent Egyptian Youth Revolution. Another key theme of the discussion will be the role of ICTs (and the practices that they mediate) in fostering transparency in commodity chains. The symposium will ask what difference these technologies can make to the practices of food consumption and production. After the discussions, we will initiate an international network of food thinkers and actors that will function as a platform for knowledge sharing and collaboration. The participants will be invited to engage in planning for the ongoing future development of the network.
Abstract:
The underlying objective of this study was to develop a novel approach to evaluating the potential for commercialisation of a new technology. More specifically, this study examined the 'ex-ante' evaluation of the technology transfer process. For this purpose, a technology originating from the high technology sector was used. The technology relates to the application of software for the detection of weak signals from space, an established method of signal processing in the field of radio astronomy. This technology has the potential to be used in commercial and industrial areas other than astronomy, such as detecting water leakage in pipes. Its applicability to detecting water leakage was chosen owing to several problems with detection in the industry, as well as the impact it can have on saving water in the environment. The study therefore demonstrates the importance of interdisciplinary technology transfer. The study employed both technical and business evaluation methods, including laboratory experiments and the Delphi technique, to address the research questions. There are several findings from this study. Firstly, scientific experiments were conducted, and these brought the chosen technology to the proof-of-concept stage. Secondly, criteria from the literature that can be used for 'ex-ante' evaluation of technology transfer were validated and refined. Additionally, after testing the chosen technology's overall transfer potential using the modified set of criteria, it was found that the technology is still in its early stages and will require further development before it can be commercialised. Furthermore, a final evaluation framework was developed encompassing all the criteria found to be important. This framework can help in assessing the overall readiness of a technology for transfer, as well as in recommending a viable mechanism for commercialisation. On the whole, the commercial potential of the chosen technology was tested through expert opinion, thereby focusing on the impact of a new technology and the feasibility of alternative and potential future applications.
Abstract:
A polymerase chain reaction (PCR) assay was developed for the detection of Banana bunchy top virus (BBTV), requiring at most 210 min and at least 90 min using Pc-1 and Pc-2, respectively. PCR detection of BBTV in crude sap indicated that freezing banana tissue in liquid nitrogen (LN2) before extraction was more effective than grinding with sand as the extraction technique. BBTV was also detected by PCR assay in 69 healthy and diseased plants using Na-PO4 buffer containing 1% SDS. PCR detection of BBTV in nucleic acid extracts prepared with seven different extraction buffers was studied in order to adapt PCR for routine detection in the field. The results showed that BBTV was detected with higher sensitivity in nucleic acid extracts than in infectious sap. The results also suggested a common aetiology for BBTV, based on PCR reactions with BBTV nucleic acid extracts from Australia, Burundi, Egypt, France, Gabon, the Philippines, and Taiwan. The results further showed a positive relationship between the Egyptian BBTV isolate and the abaca bunchy top isolate from the Philippines, but no relationship was found with Cucumber mosaic cucumovirus (CMV) isolates from Egypt and the Philippines or with Banana bract mosaic virus (BBMV).
Abstract:
This paper describes a simple activity for plotting and characterising the light curve of an exoplanet transit event by way of differential photometry analysis. Using free digital imaging software, participants analyse a series of telescope images with the goal of calculating various exoplanet parameters, including the planet's size, orbital radius, and habitability. The activity has been designed for high-school or undergraduate university level and introduces fundamental concepts in astrophysics, along with an understanding of the basis of exoplanetary science, the transit method, and digital photometry.
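The arithmetic at the heart of the activity can be sketched as follows (a hedged illustration, not the activity's actual materials; the flux values, Sun-like host star, and normalisation step are all invented assumptions):

```python
import numpy as np

# Hypothetical instrumental fluxes (arbitrary units) measured from
# the image series for the target star and a comparison star.
target = np.array([1000.0, 998.0, 990.0, 985.0, 990.0, 999.0, 1001.0])
compar = np.array([500.0, 499.0, 501.0, 500.0, 498.0, 500.0, 501.0])

# Differential photometry: dividing by the comparison star removes
# atmospheric and instrumental variations common to both stars.
rel_flux = target / compar
rel_flux /= np.median(rel_flux[[0, -1]])  # normalise to out-of-transit level

# Transit depth relates planet and star sizes: depth ~ (Rp/Rs)^2.
depth = 1.0 - rel_flux.min()
Rs = 6.957e8          # m, assuming a Sun-like host star
Rp = Rs * np.sqrt(depth)
print("Depth %.3f -> Rp = %.2e m (%.2f Jupiter radii)" % (depth, Rp, Rp / 7.1492e7))
```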
Abstract:
A recent theoretical investigation by Terzieva & Herbst of linear carbon chains Cn (n ≥ 6) in the interstellar medium has shown that these species can undergo efficient radiative association to form the corresponding anions. An experimental study by Barckholtz, Snow & Bierbaum of these anions has demonstrated that they do not react efficiently with molecular hydrogen, leading to the possibility of detectable abundances of cumulene-type anions in dense interstellar and circumstellar environments. Here we present a series of electronic structure calculations which examine possible anionic candidates for detection in these media, namely the anion analogues of the previously identified interstellar cumulenes CnH and Cn-1CH2 and heterocumulenes CnO (where n = 2-10). The extraordinary electron affinities calculated for these molecules suggest that efficient radiative electron attachment could occur, and the large dipole moments of these simple (generally) linear molecules point to the possibility of detection by radio astronomy.
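The link between large dipole moments and radio detectability can be made explicit with the standard expression for the spontaneous emission rate of a rotational transition of a linear molecule (a textbook result, not specific to this paper):

```latex
A_{J+1 \to J} \;=\; \frac{64\pi^{4}\nu^{3}}{3hc^{3}}\,\mu^{2}\,\frac{J+1}{2J+3}
\;\;\propto\;\; \mu^{2}\nu^{3},
```

so the rotational lines of strongly polar anions are intrinsically bright, favouring their detection by radio telescopes.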
Abstract:
This paper presents Australian results from the Interests and Recruitment in Science (IRIS) study with respect to the influence of STEM-related mass media, including science fiction, on students' decisions to enrol in university STEM courses. The study found that across the full cohort (N = 2999), students tended to attribute far greater influence to science-related documentaries and channels, such as Life on Earth and the Discovery Channel, than to science-fiction movies or STEM-related TV dramas. Males were more inclined than females to consider science fiction/fantasy books and films and popular science books/magazines as having been important in their decisions. Students taking physics/astronomy tended to rate the importance of science fiction/fantasy books and films higher than students in other courses. The implications of these results for our understanding of influences on STEM enrolments are discussed.
Abstract:
"Historically, science had a place in education before the time of Plato and Aristotle (e.g., Stonehenge). Technology gradually increased since early human inventions (e.g., indigenous tools and weapons), rose up dramatically through the industrial revolution and escalated exponentially during the twentieth and twenty-first centuries, particularly with the advent of the Internet. Engineering accomplishments were evident in the constructs of early civil works, including roads and structural feats such as the Egyptian pyramids. Mathematics was not as clearly defined BC (Seeds 2010), but was utilized for more than two millennia (e.g., Archimedes, Kepler, and Newton) and paved its way into education as an essential scientific tool and a way of discovering new possibilities. Hence, combining science, technology, engineering, and mathematics (STEM) areas should not come as a surprise but rather as a unique way of packaging what has been ..."--Publisher Website
Abstract:
The cliché about modern architecture being the fairy-tale fulfillment of every fantasy ceases to be a cliché only when it is accompanied by the fairy tale’s moral: that the fulfillment of the wishes rarely engenders goodness in the one doing the wishing (Adorno). Wishing for the right things in architecture and the city is the most difficult art of all: since the grim childhood-tales of the twentieth century we have been weaned from dreams and utopias, the stuff of modernism’s bad conscience. For Adorno writing in 1953, Hollywood cinema was a medium of “regression” based on infantile wish fulfillment manufactured by the industrial repetition (mimesis) of the filmic image that he called a modern “hieroglyphics,” like the archaic language of pictures in Ancient Egypt which guaranteed immortality after death in Egyptian burial rites. Arguably, today the iconic architecture industry is the executor of archaic images of modernity linked to rituals of death, promises of omnipotence and immortality. As I will argue in this symposium, such buildings are not a reflection of external ‘reality,’ but regression to an internal architectural polemic that secretly carries out the rituals of modernism’s death and seeks to make good on the liabilities of architectural history.
Abstract:
This article describes a simple parallax experiment performed by undergraduate physics students at Queensland University of Technology. The experiment is analogous to the parallax method used in astronomy to measure distances to nearby stars. The result of one of these experiments is presented in this paper. A target was photographed using a digital camera at five distances between 3 and 8 metres from two vantage points spaced 0.6 m apart. The parallax distances were compared with the actual distances measured using a tape measure, and the average error was 0.5 ± 0.9%.
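The underlying arithmetic can be sketched as follows (a hedged illustration: only the 0.6 m baseline comes from the abstract; the pixel shift and camera plate scale are invented for the example):

```python
import math

baseline = 0.6  # m, separation of the two vantage points (from the abstract)

# Hypothetical measurement: the target shifts by 480 pixels against the
# distant background between the two photographs, with an assumed camera
# plate scale of 0.0143 degrees per pixel.
shift_pixels = 480
deg_per_pixel = 0.0143
parallax = math.radians(shift_pixels * deg_per_pixel)  # parallax angle, rad

# Small-angle parallax relation, as used for nearby stars: d = b / theta.
distance = baseline / parallax
print("Estimated distance: %.2f m" % distance)  # ~5 m for these numbers
```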
Abstract:
For Adorno, writing in 1953, Hollywood cinema was a medium of “regression” based on infantile wish fulfillment manufactured by the industrial repetition of the filmic image, which he called a modern “hieroglyphics”—like the archaic language of pictures in Ancient Egypt, which guaranteed immortality after death in Egyptian burial rites. From that 1953 essay Prolog zum Fernsehen to Das Schema der Massenkultur in 1981, Adorno likened film frames to cultural ideograms: what he called the filmic “language of images” (Bildersprache) constituted a Hieroglyphenschrift that visualised forbidden sexual impulses and ideations of death and domination in the unconscious of the mass spectator. In a famous passage he writes, “As image, the image-writing (Bilderschrift) is a medium of regression, where the producer and consumer coincide; as writing, film resurrects the archaic images of modernity.” In other words, cinema takes the spectator on a journey into his unconscious in order to control him from within. It works because the spectator begins to believe the film is speaking to him in his very own image-language (the unconscious), making him do and buy whatever capitalism demands. Modernity for Adorno is precisely the instrumentalisation of the collective unconscious through the mediatic images of the culture industry.
Abstract:
The detection of gamma-ray emission from a class of active galactic nuclei (viz. blazars) has been one of the important findings from the Compton Gamma-Ray Observatory (CGRO). However, their gamma-ray luminosity function has not been well determined. A few attempts have been made in earlier works, where BL Lacs and Flat Spectrum Radio Quasars (FSRQs) were considered as a single source class. In this paper, we investigate the evolution and gamma-ray luminosity function of FSRQs and BL Lacs separately. Our investigation indicates no evolution for BL Lacs; FSRQs, however, show significant evolution. Pure luminosity evolution is assumed for FSRQs, and exponential and power-law evolution models are examined. Due to the small number of sources, the low-luminosity-end index of the luminosity function for FSRQs is constrained only by an upper limit. The BL Lac luminosity function shows no signature of a break. As a consistency check, the model source distributions derived from these luminosity functions show no significant departure from the observed source distributions.
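For reference, pure luminosity evolution is commonly parameterised as a power law in redshift or an exponential in look-back time (typical forms only; the exact parameterisations examined in the paper may differ, and β and τ0 here are hypothetical fit parameters):

```latex
L(z) = L(0)\,(1+z)^{\beta}
\qquad \text{or} \qquad
L(z) = L(0)\,e^{\tau(z)/\tau_{0}},
```

where τ(z) is the look-back time to redshift z.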
Abstract:
We believe the Babcock-Leighton process of poloidal field generation to be the main source of irregularity in the solar cycle. The random nature of this process may make the poloidal field in one hemisphere stronger than that in the other at the end of a cycle. We expect this to induce an asymmetry in the next sunspot cycle. We look for evidence of this in the observational data and then model it theoretically with our dynamo code. Since actual polar field measurements exist only from the 1970s, we use the polar faculae number data recorded by Sheeley (1991, 2008) as a proxy for the polar field and estimate the hemispheric asymmetry of the polar field at different solar minima during the major part of the twentieth century. This asymmetry is found to have a reasonable correlation with the asymmetry of the next cycle. We then run our dynamo code, feeding in information about this asymmetry at the successive minima, and compare the results with observational data. We find that the theoretically computed asymmetries of different cycles compare favorably with the observational data, the correlation coefficient being 0.73. Due to the coupling between the two hemispheres, any hemispheric asymmetry tends to be attenuated with time. The hemispheric asymmetry of a cycle, whether from observational data or from theoretical calculations, statistically tends to be less than the asymmetry in the polar field (as inferred from the faculae data) at the preceding minimum. This reduction factor turns out to be 0.43 in the observational data and 0.51 in the theoretical simulations.
Abstract:
We provide a 2.5-dimensional solution to a complete set of viscous hydrodynamical equations describing accretion-induced outflows and plausible jets around black holes/compact objects. We prescribe a self-consistent advective disk-outflow coupling model which explicitly includes information about the vertical flux. The interconnected dynamics of the inflow-outflow system essentially upholds the conservation laws. We provide a family of analytical solutions through a self-similar approach. The flow parameters of the disk-outflow system depend strongly on the viscosity parameter α and the cooling factor.
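The viscosity parameter α here is presumably that of the standard Shakura-Sunyaev prescription, in which the kinematic viscosity is written in terms of the local sound speed c_s and the disk scale height H:

```latex
\nu = \alpha\, c_{s} H, \qquad 0 < \alpha \lesssim 1 .
```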
Abstract:
We investigate the evolution of rotation period and spindown age of a pulsar whose surface magnetic field undergoes a phase of growth. Application of these results to the Crab pulsar strongly indicates that its parameters cannot be accounted for by the field growth theories.
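For context, the spindown age in question is conventionally the characteristic age inferred from magnetic dipole braking with a constant surface field (a standard definition; a growing field violates its assumptions, which is the point at issue):

```latex
\tau_{c} \;=\; \frac{P}{2\dot{P}},
\qquad\text{with}\qquad
P\dot{P} \;\propto\; B^{2}
```

for a constant dipole field B, so a field that grows with time makes τc a biased estimate of the true age.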