472 results for Common core standards
Abstract:
Differential axial deformation between column elements and shear wall elements of cores increases with building height and geometric complexity. Adverse effects of this differential axial deformation reduce building performance and lifetime serviceability. Quantifying axial deformations from ambient measurements taken with vibrating-wire, external mechanical and electronic strain gauges, in order to make adequate provisions to mitigate these adverse effects, is a well-established approach. However, such gauges must be installed in or on elements to acquire continuous measurements, which makes their use uneconomical and inconvenient. This motivates the development of an alternative method for quantifying axial deformations. This paper proposes an innovative method based on modal parameters to quantify the axial deformations of shear wall elements in the cores of buildings. The capabilities of the method are demonstrated through an illustrative example.
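For orientation, the quantity at stake is the cumulative axial shortening of a vertical element, classically computed storey by storey as delta = NL/(EA). The Python sketch below evaluates that sum with invented loads and sections; it illustrates only the quantity the paper's modal-parameter method seeks to estimate without gauges, not the method itself.

```python
# Classical per-storey axial shortening of a core wall element.
# All numbers are invented for illustration.

E = 30e9                      # concrete elastic modulus, Pa (assumed)
storeys = [
    # (axial load N in newtons, storey height L in m, section area A in m^2)
    (12.0e6, 3.5, 0.64),      # hypothetical lower storey
    (8.0e6, 3.5, 0.49),
    (4.0e6, 3.5, 0.36),
]

# Total shortening at the top = sum of each storey's axial deformation
total = sum(N * L / (E * A) for N, L, A in storeys)
print(f"cumulative axial shortening: {total * 1000:.2f} mm")
```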
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest-growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-Time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. The core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
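As a small illustration of two of the Part I core topics (time-domain convolution and spectral analysis via the Discrete Fourier Transform), the following Python/NumPy sketch filters and analyses an invented two-tone signal. The signal, filter and sampling rate are illustrative only and are not taken from the book.

```python
# Minimal DSP sketch: FIR filtering by convolution, then DFT-based
# spectral analysis of a two-tone test signal.
import numpy as np

fs = 1000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)                # 1 second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# FIR moving-average filter applied by convolution in the time domain
h = np.ones(8) / 8                         # 8-tap averaging filter
y = np.convolve(x, h, mode='same')         # y[n] = sum_k h[k] x[n-k]

# Spectral analysis: DFT via the FFT reveals the two sinusoidal components
X = np.fft.fft(x)
freqs = np.fft.fftfreq(len(x), d=1 / fs)
peak = freqs[np.argmax(np.abs(X[:len(x) // 2]))]
print(f"dominant frequency: {peak:.0f} Hz")   # expected: 50 Hz
```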
Abstract:
With the identification of common single-locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, the methylene tetrahydrofolate reductase (MTHFR) 677 and factor V HR2 haplotype variants, traditional testing methodologies have proved to be less useful and instead DNA technology is more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts (as in single-strand conformation polymorphism or denaturing gradient gel electrophoresis), and product formation in allele-specific amplification. Such techniques may be sensitive, but are unwieldy and often need to be validated objectively. In order to overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays), have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer), which allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors including cost and the need for speed, simplicity and robustness. © 2000 Lippincott Williams & Wilkins.
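To make the restriction-site approach concrete, here is a hedged Python sketch of the underlying logic: a point mutation that abolishes an enzyme recognition site changes the fragment pattern seen on a gel. The recognition sequence and DNA fragments below are invented for illustration and do not correspond to any particular enzyme or variant.

```python
# Toy model of restriction-site genotyping: cut a sequence at every
# occurrence of a recognition site and compare fragment patterns.
# A variant that abolishes a site yields fewer, longer fragments.
RECOGNITION_SITE = "CCTC"   # hypothetical enzyme recognition sequence

def cut_positions(seq: str, site: str) -> list[int]:
    """Indices where the recognition site occurs (cut at site start)."""
    return [i for i in range(len(seq) - len(site) + 1)
            if seq[i:i + len(site)] == site]

def fragment_lengths(seq: str, site: str) -> list[int]:
    """Fragment lengths produced by cutting at each recognition site."""
    cuts = [0] + cut_positions(seq, site) + [len(seq)]
    return [b - a for a, b in zip(cuts, cuts[1:]) if b > a]

wild_type = "AGGTCCTCAGGATTCCTCAGT"   # two sites -> three fragments
variant   = "AGGTCCTTAGGATTCCTCAGT"   # C->T abolishes the first site

print(fragment_lengths(wild_type, RECOGNITION_SITE))  # [4, 10, 7]
print(fragment_lengths(variant, RECOGNITION_SITE))    # [14, 7]
```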
Abstract:
We have previously reported the use of a novel mini-sequencing protocol, first nucleotide change (FNC) technology, for detection of the factor V Leiden variant. This technology is based on a single-nucleotide extension of a primer that is hybridized immediately adjacent to the site of mutation. The extended nucleotide, which carries a reporter molecule (fluorescein), discriminates the genotype at the site of mutation. More recently, the prothrombin 20210 and thermolabile methylene tetrahydrofolate reductase (MTHFR) 677 variants have been identified as possible risk factors associated with thrombophilia. This study describes the use of FNC technology in a combined assay to detect the factor V, prothrombin and MTHFR variants in a population of Australian blood donors, and describes the objective numerical methodology used to determine genotype cut-off values for each genetic variation. Using FNC to test 500 normal blood donors, the incidence of factor V Leiden was 3.6% (all heterozygous), that of prothrombin 20210 was 2.8% (all heterozygous) and that of MTHFR 677 was 10% (homozygous). The combined FNC technology offers a simple, rapid, automatable DNA-based test for the detection of these three important mutations associated with familial thrombophilia. © 2000 Lippincott Williams & Wilkins.
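A hedged sketch of the kind of numerical genotype-calling rule such a cut-off methodology implies is given below: each sample yields fluorescence signals for the two possible extended nucleotides, and the mutant-allele signal fraction is compared against cut-off values. The thresholds and signal values are illustrative assumptions, not the values determined in the study.

```python
# Toy genotype caller: compare the mutant-allele fluorescence fraction
# against cut-off values. Thresholds below are illustrative only.

def call_genotype(signal_wt: float, signal_mut: float,
                  lower: float = 0.2, upper: float = 0.8) -> str:
    """Classify a sample from its mutant-allele signal fraction."""
    total = signal_wt + signal_mut
    if total <= 0:
        return "no call"            # failed extension reaction
    frac_mut = signal_mut / total
    if frac_mut < lower:
        return "wild type"          # mutant signal at background level
    if frac_mut > upper:
        return "homozygous mutant"  # essentially all signal from mutant base
    return "heterozygous"           # both alleles extended

# Example: roughly equal signals -> heterozygous
print(call_genotype(signal_wt=1050.0, signal_mut=980.0))
```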
Abstract:
The complex transition from convict to free labour influenced state intervention in the employment relationship, and initiated the first minimum labour standards in Australia in 1828. Since then, two principal sets of tensions have affected the enforcement of such standards: tensions between government and employers, and tensions between the major political parties over industrial and economic issues. This article argues that these tensions have resulted in a sustained legacy affecting minimum labour standards’ enforcement in Australia. The article outlines broad historical developments and contexts of minimum labour standards’ enforcement in Australia since 1828, with more contemporary exploration focusing specifically on enforcement practices and policies in the Australian federal industrial relations jurisdiction. Current enforcement practices are an outcome of this volatile history, and past influences remain strong.
Abstract:
BACKGROUND: Endometriosis is a polygenic disease with a complex and multifactorial aetiology that affects 8-10% of women of reproductive age. Epidemiological data support a link between endometriosis and cancers of the reproductive tract. Fibroblast growth factor receptor 2 (FGFR2) has recently been implicated in both endometrial and breast cancer. Our previous studies on endometriosis identified significant linkage to a novel susceptibility locus on chromosome 10q26, and the FGFR2 gene maps within this linkage region. We therefore hypothesized that variation in FGFR2 may contribute to the risk of endometriosis. METHODS: We genotyped 13 single nucleotide polymorphisms (SNPs) densely covering a 27 kb region within intron 2 of FGFR2, including two SNPs (rs2981582 and rs1219648) significantly associated with breast cancer, and a total of 40 tagSNPs across 150 kb of the FGFR2 gene. SNPs were genotyped in 958 endometriosis cases and 959 unrelated controls. RESULTS: We found no evidence for association between endometriosis and FGFR2 intron 2 SNPs or SNP haplotypes, and no evidence for association between endometriosis and variation across the FGFR2 gene. CONCLUSIONS: Common variation in the breast cancer-implicated intron 2 and in other highly plausible causative candidate regions of FGFR2 does not appear to be a major contributor to endometriosis susceptibility in our large Australian sample.
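As a hedged illustration of the standard allelic case-control association test used in studies of this kind, the Python sketch below runs a chi-square test on a 2x2 table of allele counts; the counts are invented and do not come from the study.

```python
# Allelic case-control association test via a chi-square test on a
# 2x2 contingency table. Allele counts below are hypothetical.
from scipy.stats import chi2_contingency

# Rows: cases, controls; columns: counts of the two alleles.
# 958 cases and 959 controls contribute 1916 and 1918 alleles.
observed = [[720, 1196],    # hypothetical case allele counts
            [700, 1218]]    # hypothetical control allele counts

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# A p-value well above 0.05, as reported for FGFR2, gives no evidence
# that the variant is associated with endometriosis.
```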
Abstract:
One of the fundamental issues that remains unresolved in patent law today, both in Australia and in other jurisdictions, is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether Australian patent law contains a physicality requirement. Despite being recently considered by the Federal Court, this is arguably an issue that has yet to be satisfactorily resolved in Australia. In its 2006 decision in Grant v Commissioner of Patents, the Full Court of the Federal Court of Australia found that the patentable subject matter standard is rooted in the physical, when it held that an invention must involve a physical effect or transformation to be patent eligible. That decision, however, has been the subject of scrutiny in the academic literature. This article seeks to add to the existing literature written in response to the Grant decision by examining in detail the key common law cases decided prior to the High Court's watershed decision in National Research Development Corporation v Commissioner of Patents, which is the undisputed authoritative statement of principle with regard to the patentable subject matter standard in Australia. This article, in conjunction with others written by the author, questions the Federal Court's assertion in Grant that the physicality requirement it established is consistent with existing law.
Abstract:
This paper suggests that when a course is planned within one culture for delivery to members of another culture, appropriate quality control of assessment becomes an issue of major proportions. Based on their experience of presenting an Aid Agency-funded Masters course in a developing country in the Pacific, the authors describe the processes used to address the needs and wants of all the stakeholders, who had different cultural expectations. Maintaining a balance between domestic and Pacific student cohorts regarding resources and opportunities for study was especially challenging. However, grounding grades in the course curriculum and clearly stated objectives permitted the teaching team to meet external requirements while maintaining their professional and academic freedom.
Abstract:
Germ-line mutations in CDKN2A have been shown to predispose to cutaneous malignant melanoma. We have identified 2 new melanoma kindreds carrying a duplication, previously identified in melanoma families from Australia and the United States, of a 24 bp repeat in the 5' region of CDKN2A. This mutation has now been reported in 5 melanoma families from 3 continents: Europe, North America, and Australasia. The M53I mutation in exon 2 of CDKN2A has also been documented in 5 melanoma families from Australia and North America. The aim of this study was to determine whether the occurrence of these mutations in families from geographically diverse populations represented mutation hotspots within CDKN2A or was due to common ancestors. Haplotypes of 11 microsatellite markers flanking CDKN2A were constructed in 5 families carrying the M53I mutation and 5 families carrying the 24 bp duplication. There were some differences in the segregating haplotypes, due primarily to recombinations and mutations within the short tandem-repeat markers; however, the data provide evidence to indicate that there were at least 3 independent 24 bp duplication events and possibly only 1 original M53I mutation. This is the first study to date that indicates common founders in melanoma families from different continents.
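A hedged sketch of the haplotype comparison underlying such a founder analysis: families sharing an ancestral mutation should carry largely identical marker alleles flanking CDKN2A, apart from occasional recombination or microsatellite mutation. The marker alleles below are invented for illustration.

```python
# Toy comparison of marker haplotypes segregating with a mutation in
# two families; high allele sharing suggests a common founder.

def shared_markers(hap_a: list[int], hap_b: list[int]) -> int:
    """Count positions where two marker haplotypes carry the same allele."""
    return sum(a == b for a, b in zip(hap_a, hap_b))

# Hypothetical 11-marker haplotypes around CDKN2A in two families
family_1 = [3, 7, 2, 5, 5, 1, 4, 6, 2, 3, 8]
family_2 = [3, 7, 2, 5, 4, 1, 4, 6, 2, 3, 8]   # one microsatellite step

n = shared_markers(family_1, family_2)
print(f"{n}/11 markers shared")   # high sharing -> likely common founder
```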
Abstract:
This ALTC Teaching Fellowship aimed to establish Guiding Principles for Library and Information Science Education 2.0. The aim was achieved by (i) identifying the current and anticipated skills and knowledge required by successful library and information science (LIS) professionals in the age of web 2.0 (and beyond), and (ii) establishing the current state of LIS education in Australia in supporting the development of librarian 2.0 and, in doing so, identifying models of best practice.
The fellowship has contributed to curriculum renewal in the LIS profession. It has helped to ensure that LIS education in Australia continues to meet the changing skills and knowledge requirements of the profession it supports. It has also provided a vehicle through which LIS professionals and LIS educators may find opportunities for greater collaboration and more open communication. This will help bridge the gap between LIS theory and practice and will foster more authentic engagement between LIS education and other parts of the LIS industry in the education of the next generation of professionals. Through this fellowship the LIS discipline has become a role model for other disciplines that will face similar issues in the coming years.
Eighty-one members of the Australian LIS profession participated in a series of focus groups exploring the current and anticipated skills and knowledge needed by the LIS professional in the web 2.0 world and beyond. Whilst each focus group tended to draw on specific themes of interest to that particular group of people, there was a great deal of common ground. Eight key themes emerged: technology, learning and education, research or evidence-based practice, communication, collaboration and team work, user focus, business savvy and personal traits.
It was acknowledged that the need for successful LIS professionals to possess transferable skills and interpersonal attributes was not new. It was noted, however, that the speed with which things are changing in the web 2.0 world was having a significant impact, and that this faster pace was placing a new and unexpected emphasis on transferable skills and knowledge. It was also acknowledged that all librarians need to possess these skills, knowledge and attributes, not just the one or two role models who lead the way.
The most interesting finding, however, was that web 2.0, library 2.0 and librarian 2.0 represented a ‘watershed’ for the LIS profession. Almost all the focus groups spoke about how they are seeing and experiencing a culture change in the profession. Librarian 2.0 requires a ‘different mindset or attitude’. The Levels of Perspective model by Daniel Kim provides one lens through which to view this finding. The focus group findings suggest that we are witnessing a re-awakening of the Australian LIS profession as it begins to move towards the higher levels of Kim’s model (i.e. mental models, vision).
Thirty-six LIS educators participated in telephone interviews aimed at exploring the current state of LIS education in supporting the development of librarian 2.0. The skills and knowledge of LIS professionals in a web 2.0 world that were identified and discussed by the LIS educators mirrored those highlighted in the focus group discussions with LIS professionals. Similarly, it was noted that librarian 2.0 needed to focus less on skills and knowledge and more on attitude. However, whilst LIS professionals felt that there was a paradigm shift within the profession, LIS educators did not speak with one voice on this matter, with quite a number of the educators suggesting that this might be ‘overstating it a bit’. This study provides evidence of the “disparate viewpoints” (Hallam, 2007) between LIS educators and LIS professionals, which can have significant implications not just for LIS professional education but for the profession generally.
Inviting the LIS academics to discuss how their teaching and learning activities support the development of librarian 2.0 was a core part of the interviews conducted. The strategies used and the challenges faced by LIS educators in developing their teaching and learning approaches to support the formation of librarian 2.0 are identified and discussed. A core part of the fellowship was the identification of best practice examples on how LIS educators were developing librarian 2.0. Twelve best practice examples were identified. Each educator was recorded discussing his or her approach to teaching and learning. Videos of these interviews are available via the Fellowship blog at