889 results for FILTER
Abstract:
Big Data is a rising IT trend similar to cloud computing, social networking and ubiquitous computing. Big Data can offer beneficial scenarios in the e-health arena; however, it must often be kept secure for long periods of time so that its benefits, such as finding cures for infectious diseases, can be realised while patient privacy is protected. It is therefore valuable to analyse Big Data to extract meaningful information while the data remains securely stored, and an analysis of the available database encryption techniques is essential. In this study, we simulated three technical environments, namely plain-text, Microsoft built-in encryption, and a custom Advanced Encryption Standard (AES) implementation, using a Bucket Index in Data-as-a-Service (DaaS). The results showed that the custom AES-DaaS approach has a faster range-query response time than Microsoft's built-in encryption. Furthermore, the scalability tests showed that there are performance thresholds that depend on the available physical IT resources. For efficient Big Data management in e-health, it is therefore worth examining these scalability limits, even in a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
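As a rough illustration of the bucket-index mechanism used in the study, the following Python sketch shows how a range query runs against an encrypted table: the server matches only on coarse bucket ids, and the client decrypts and discards the bucket false positives. The bucket width, record format, and XOR "cipher" are all toy stand-ins; a real DaaS deployment would use AES, as in the study.

```python
import hashlib

# Toy stand-ins: BUCKET_WIDTH, the record format, and the XOR "cipher"
# (a real deployment would use AES, e.g. AES-GCM from a vetted library).
BUCKET_WIDTH = 10

def bucket_of(value: int) -> int:
    return value // BUCKET_WIDTH          # the only value the server sees

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    pad = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, pad))

toy_decrypt = toy_encrypt                 # XOR stand-in is its own inverse

def server_store(key, records):
    # The server holds only (bucket_id, ciphertext) pairs.
    return [(bucket_of(age), toy_encrypt(key, f"{name}:{age}".encode()))
            for name, age in records]

def range_query(key, table, lo, hi):
    # 1) Map the range to bucket ids; the server returns matching rows.
    wanted = set(range(bucket_of(lo), bucket_of(hi) + 1))
    candidates = [ct for b, ct in table if b in wanted]
    # 2) The client decrypts and discards bucket false positives.
    out = []
    for ct in candidates:
        name, age = toy_decrypt(key, ct).decode().rsplit(":", 1)
        if lo <= int(age) <= hi:
            out.append((name, int(age)))
    return out

key = b"demo key"
table = server_store(key, [("alice", 34), ("bob", 47), ("carol", 29)])
print(range_query(key, table, 30, 45))    # -> [('alice', 34)]
```

Note that bob (47) falls in a matching bucket but is filtered out client-side: the coarser the buckets, the less the server learns, at the cost of more false positives transferred and decrypted.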
Abstract:
This paper presents a recursive strategy for online detection of actuator faults on an unmanned aerial system (UAS) subjected to accidental actuator faults. The proposed detection algorithm aims to provide a UAS with the capability of identifying and determining the characteristics of actuator faults, offering the flight information necessary for the design of fault-tolerant mechanisms that compensate for the resulting side-effects when faults occur. The proposed fault detection strategy consists of a bank of unscented Kalman filters (UKFs), each one detecting a specific type of actuator fault and estimating the corresponding velocity and attitude information. The performance of the proposed method is evaluated using a typical nonlinear UAS model, and simulations demonstrate that the method is able to detect representative faults with sufficient accuracy and acceptable time delay, and can be applied to the design of fault-tolerant flight control systems for UASs.
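The bank-of-filters idea can be sketched as follows. The paper uses unscented Kalman filters on a nonlinear UAS model; this minimal Python example substitutes scalar linear Kalman filters on a toy one-dimensional actuator model, but keeps the same detection logic: each filter assumes one fault hypothesis, and the hypothesis whose filter best explains the measurements (smallest smoothed normalized innovation) is declared active. All model constants are illustrative.

```python
import numpy as np

class ScalarKF:
    # One filter per fault hypothesis (linear stand-in for a UKF).
    def __init__(self, a, h, q, r, x0=0.0, p0=1.0):
        self.a, self.h, self.q, self.r, self.x, self.p = a, h, q, r, x0, p0

    def step(self, u, z):
        # Predict under this hypothesis's assumed actuator effectiveness.
        self.x = self.a * self.x + u
        self.p = self.a * self.p * self.a + self.q
        y = z - self.h * self.x                   # innovation
        s = self.h * self.p * self.h + self.r     # innovation variance
        k = self.p * self.h / s
        self.x += k * y
        self.p *= 1 - k * self.h
        return y * y / s                          # normalized innovation^2

# Toy 1-D model: x(t+1) = 0.95 x(t) + eff * u(t); each hypothesis assumes
# a different actuator effectiveness eff. The true fault is a 50% loss.
rng = np.random.default_rng(0)
hypotheses = {"nominal": 1.0, "50% effectiveness loss": 0.5, "stuck": 0.0}
bank = {name: ScalarKF(0.95, 1.0, 1e-3, 0.05) for name in hypotheses}
x, scores = 0.0, {name: 0.0 for name in hypotheses}
for t in range(200):
    u = 1.0                                       # commanded input
    x = 0.95 * x + 0.5 * u + rng.normal(0, 0.03)  # true (faulty) plant
    z = x + rng.normal(0, 0.2)                    # noisy measurement
    for name, eff in hypotheses.items():
        scores[name] = 0.98 * scores[name] + bank[name].step(eff * u, z)
print(min(scores, key=scores.get))                # -> 50% loss hypothesis
```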
Abstract:
Aerosol mass spectrometers (AMS) are powerful tools for analysing the chemical composition of airborne particles, particularly the organic aerosols which are gaining increasing attention. However, the advantages of AMS in providing on-line data can be outweighed by the difficulties involved in its use in field measurements at multiple sites. In contrast to on-line measurement by AMS, a method in which samples are collected on filters and subsequently analysed by AMS could significantly broaden the scope of AMS application. We report the application of such an approach to field studies at multiple sites. An AMS was deployed at 5 urban schools to determine the sources of the organic aerosols at the schools directly. PM1 aerosols were also collected on filters at these and 20 other urban schools. The filters were extracted with water and the extract passed through a nebulizer to regenerate the aerosols, which were then analysed by AMS. The mass spectra from the samples collected on filters at the 5 schools were found to have excellent correlations with those obtained directly by AMS, with r2 ranging from 0.89 to 0.98. Filter recoveries varied between the schools from 40% to 115%, possibly indicating that this method provides qualitative rather than quantitative information. The stability of the organic aerosols on Teflon filters was demonstrated by analysing samples stored for up to two years. Application of the procedure to the remaining 20 schools showed that secondary organic aerosols were the main source of aerosols at the majority of the schools. Overall, this procedure provides an accurate representation of the mass spectra of ambient organic aerosols and could facilitate rapid data acquisition at multiple sites where an AMS cannot be deployed for logistical reasons.
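The spectral comparison underlying the reported r2 values can be expressed compactly. The sketch below, with hypothetical inputs, computes the coefficient of determination between paired spectra and the corresponding mass recovery fraction.

```python
import numpy as np

def compare_spectra(direct, filter_based):
    # direct, filter_based: paired intensity vectors over the same m/z
    # bins, from on-line AMS sampling and the nebulized filter extract.
    r = np.corrcoef(direct, filter_based)[0, 1]
    r2 = r ** 2                                   # spectral agreement
    recovery = filter_based.sum() / direct.sum()  # mass recovery fraction
    return r2, recovery
```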
Abstract:
This research project contributed to an in-depth understanding of the influence of hydrologic and hydraulic factors on the stormwater treatment performance of constructed wetlands and bioretention basins in the "real world". The project was based on comprehensive field monitoring of a Water Sensitive Urban Design treatment train, underpinned by complex multivariate statistical analysis. The project outcomes revealed that the reduction in pollutant concentrations was consistent in the constructed wetland, but was highly variable in the bioretention basin due to a range of influential factors. However, owing to the significant amount of retention within the filter media, all pollutant loadings were reduced in the bioretention basin.
Abstract:
A new bioluminescent creatine kinase (CK) assay using purified luciferase was used to analyse CK activity in serum samples dried on filter paper. Enzyme activity was preserved for over 1 week on paper stored at room temperature. At 60°C, CK activity in liquid serum samples was rapidly inactivated, but the activity of enzyme stored on paper was preserved for at least 2 days.
Abstract:
Electrostatic spinning, or electrospinning, is a fiber spinning technique driven by a high-voltage electric field that produces fibers with diameters in the submicrometer to nanometer range [1]. Nanofibers are typical one-dimensional colloidal objects with increased tensile strength, whose length can reach a few kilometers and whose specific surface area can be 100 m² g⁻¹ or higher [2]. Nano- and microfibers made from biocompatible polymers and biopolymers have received much attention in medical applications [3], including biomedical structural elements (scaffolding used in tissue engineering [2,4-6], wound dressing [7], artificial organs and vascular grafts [8]), drug and vaccine delivery [9-11], protective shields in speciality fabrics, multifunctional membranes, etc. Other applications concern superhydrophobic coatings [12], encapsulation of solid materials [13], filter media for submicron particles in the separation industry, composite reinforcement, and structures for nano-electronic machines.
Abstract:
Stream ciphers are common cryptographic algorithms used to protect the confidentiality of frame-based communications such as mobile phone conversations and Internet traffic. Stream ciphers are ideal cryptographic algorithms for encrypting these types of traffic, as they have the potential to encrypt them quickly and securely, and have low error propagation. The main objective of this thesis is to determine whether structural features of keystream generators affect the security provided by stream ciphers. These structural features pertain to the state-update and output functions used in keystream generators. Using linear sequences as keystream to encrypt messages is known to be insecure, so modern keystream generators use nonlinear sequences as keystream. The nonlinearity can be introduced through a keystream generator's state-update function, output function, or both. The first contribution of this thesis relates to nonlinear sequences produced by the well-known Trivium stream cipher. Trivium is one of the stream ciphers selected in the final portfolio of the eSTREAM project, a multi-year European effort run by the ECRYPT network. Trivium's structural simplicity makes it a popular cipher to cryptanalyse, but to date there are no attacks in the public literature which are faster than exhaustive key search. Algebraic analyses are performed on the Trivium stream cipher, which uses a nonlinear state-update function and a linear output function to produce keystream. Two algebraic investigations are performed: an examination of the sliding property in the initialisation process, and algebraic analyses of Trivium-like stream ciphers using a combination of the algebraic techniques previously applied separately by Berbain et al. and Raddum. For certain iterations of Trivium's state-update function, we examine the sets of slid pairs, looking particularly to form chains of slid pairs. No chains exist for a small number of iterations, which has implications for the period of keystreams produced by Trivium. Secondly, using our combination of the methods of Berbain et al. and Raddum, we analysed Trivium-like ciphers and improved on previous analyses with regard to forming systems of equations for these ciphers. Using these new systems of equations, we were able to successfully recover the initial state of Bivium-A; the attack complexities for Bivium-B and Trivium were, however, worse than exhaustive key search. We also show that the selection of stages used as input to the output function, and the size of the registers used in the construction of the system of equations, affect the success of the attack. The second contribution of this thesis is the examination of state convergence. State convergence is an undesirable characteristic in keystream generators for stream ciphers, as it implies that the effective session key size of the stream cipher is smaller than the designers intended. We identify methods which can be used to detect state convergence. As a case study, the Mixer stream cipher, which uses nonlinear state-update and output functions to produce keystream, is analysed. Mixer is found to suffer from state convergence, as the state-update function used in its initialisation process is not one-to-one. A discussion of several other stream ciphers known to suffer from state convergence is given, and from our analysis of these stream ciphers, three mechanisms which can cause state convergence are identified. The effect state convergence can have on stream cipher cryptanalysis is examined; we show that state convergence can have a positive effect if the goal of the attacker is to recover the initial state of the keystream generator. The third contribution of this thesis is the examination of the distributions of bit patterns in the sequences produced by nonlinear filter generators (NLFGs) and linearly filtered nonlinear feedback shift registers. We show that the selection of stages used as input to a keystream generator's output function can affect the distribution of bit patterns in the sequences produced by these keystream generators, and that the effect differs for nonlinear filter generators and linearly filtered nonlinear feedback shift registers. In the case of NLFGs, the keystream sequences produced when the output function takes inputs from consecutive register stages are less uniform than sequences produced by NLFGs whose output functions take inputs from unevenly spaced register stages. The opposite is true for keystream sequences produced by linearly filtered nonlinear feedback shift registers.
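The third contribution lends itself to a small numerical illustration. The toy Python sketch below builds a nonlinear filter generator over an LFSR and compares the distribution of 3-bit output patterns when the output function taps consecutive versus spread register stages; the register taps, output function, and stage choices are illustrative, not taken from the thesis.

```python
from collections import Counter

def lfsr_states(state, length):
    # Fibonacci LFSR over GF(2); taps chosen for a long period
    # (characteristic polynomial x^8 + x^6 + x^5 + x^4 + 1).
    states = []
    for _ in range(length):
        states.append(tuple(state))
        fb = state[0] ^ state[4] ^ state[5] ^ state[6]
        state = state[1:] + [fb]
    return states

def keystream(states, stages):
    # Toy nonlinear output function f(x0, x1, x2) = x0 XOR (x1 AND x2),
    # applied to the selected register stages.
    a, b, c = stages
    return [s[a] ^ (s[b] & s[c]) for s in states]

def pattern_counts(bits, m=3):
    # Distribution of overlapping m-bit patterns in the keystream.
    return Counter(tuple(bits[i:i + m]) for i in range(len(bits) - m + 1))

states = lfsr_states([1, 0, 0, 0, 0, 0, 0, 1], 2000)
print(pattern_counts(keystream(states, (0, 1, 2))))  # consecutive stages
print(pattern_counts(keystream(states, (0, 3, 6))))  # spread stages
```

Comparing the two printed distributions shows how stage selection alone changes the bit-pattern statistics of the output, which is the effect the thesis quantifies for NLFGs and linearly filtered nonlinear feedback shift registers.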
Abstract:
Early warning based on real-time prediction of rain-induced instability of natural residual slopes helps to minimise human casualties from such slope failures. Slope instability prediction is complicated, as it is influenced by many factors, including soil properties, soil behaviour, slope geometry, and the location and size of deep cracks in the slope. These deep cracks can facilitate rainwater infiltration into the deep soil layers and reduce the unsaturated shear strength of residual soil. Subsequently, a slip surface can form, triggering a landslide even in partially saturated soil slopes. Although past research has shown the effects of surface cracks on soil stability, research examining the influence of deep cracks on soil stability is very limited. This study aimed to develop methodologies for predicting, in real time, the rain-induced instability of natural residual soil slopes with deep cracks; the results can be used to warn against potential rain-induced slope failures. The literature review on rain-induced instability of unsaturated residual soil slopes associated with soil cracks reveals that only limited studies have been done in the following areas related to this topic:
- Methods for detecting deep cracks in residual soil slopes.
- Practical application of unsaturated soil theory in slope stability analysis.
- Mechanistic methods for real-time prediction of rain-induced instability of critical residual soil slopes with deep cracks.
Two natural residual soil slopes at Jombok Village, Ngantang City, Indonesia, located near a residential area, were investigated to obtain the parameters required for the stability analysis of the slope. First, a survey identified all relevant field geometry, including the slope, roads, rivers, buildings, and the boundaries of the slope. Second, the electrical resistivity tomography (ERT) method was used on the slope to identify the location and geometrical characteristics of deep cracks; the two ERT array models employed in this research were dipole-dipole and azimuthal. Next, bore-hole tests were conducted at different locations on the slope to identify soil layers and to collect undisturbed soil samples for laboratory measurement of the soil parameters required for the stability analysis; Standard Penetration Tests (SPT) were undertaken at the same bore-hole locations. Undisturbed soil samples taken from the bore-holes were tested in the laboratory to determine the variation of the following soil properties with depth:
- Classification and physical properties such as grain size distribution, Atterberg limits, water content, dry density and specific gravity.
- Saturated and unsaturated shear strength properties, using a direct shear apparatus.
- Soil water characteristic curves (SWCC), using the filter paper method.
- Saturated hydraulic conductivity.
The following three methods were used to detect and simulate the location and orientation of cracks in the investigated slope: (1) the electrical resistivity distribution of the sub-soil obtained from ERT; (2) the profile of classification and physical properties of the soil, based on laboratory testing of soil samples collected from bore-holes, together with visual observations of cracks on the slope surface; and (3) the stress distribution obtained from 2D dynamic analysis of the slope using QUAKE/W software, together with the laboratory-measured soil parameters and earthquake records of the area.
It was assumed that the deep crack in the slope under investigation was generated by earthquakes. A good agreement was obtained when comparing the location and orientation of the cracks detected by Method-1 and Method-2. However, the cracks simulated by Method-3 were not in good agreement with the output of Method-1 and Method-2; this may have been due to the material properties used and the assumptions made for the analysis. From Method-1 and Method-2, it can be concluded that the ERT method can be used to detect the location and orientation of a crack in a soil slope when the ERT is conducted in very dry or very wet soil conditions. In this study, the cracks detected by the ERT were used for the stability analysis of the slope. The stability of the slope was quantified using the factor of safety (FOS) of a critical slip surface obtained by SLOPE/W using the limit equilibrium method. Pore-water pressure values for the stability analysis were obtained by coupling it with transient seepage analysis of the slope using the finite-element-based software SEEP/W. A parametric study on the stability of the investigated slope revealed that the existence of deep cracks and their location within the soil slope are critical to its stability. The following two steps are proposed to predict the rain-induced instability of a residual soil slope with cracks. (a) Step-1: transient stability analysis of the slope is conducted from the date of the investigation (initial conditions are based on the investigation) to the current date, using measured rainfall data; the stability analyses are then continued for the next 12 months using annual rainfall predicted from the previous five years of rainfall data for the area. (b) Step-2: the stability of the slope is calculated in real time using real-time measured rainfall; rainfall is predicted for the next hour or 24 hours, and the stability of the slope is calculated one hour or 24 hours in advance. If the Step-1 analysis shows critical stability for the forthcoming year, it is recommended that Step-2 be used to give a more accurate warning against future failure of the slope. In this research, the application of Step-1 to an investigated slope (Slope-1) showed that its stability was not approaching a critical value for the year 2012 (until 31st December 2012) and, therefore, the application of Step-2 was not necessary for that year. A case study (Slope-2) was used to verify the applicability of the complete proposed predictive method. A landslide occurred at Slope-2 on 31st October 2010. Following Step-1, transient seepage and stability analyses of the slope, using data obtained from field tests (bore-hole, SPT and ERT) and laboratory tests, were conducted for 12th June 2010 and found that the slope was in a critical condition on that date. It was then shown that the application of Step-2 could have predicted this failure with sufficient warning time.
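A schematic of the proposed Step-1/Step-2 warning loop is sketched below. The thesis couples SEEP/W transient seepage with SLOPE/W limit-equilibrium analysis; here a one-layer infinite-slope model with a toy rainfall-driven water balance stands in for that coupling, so every parameter value, the pore-pressure update, and the warning threshold are illustrative only.

```python
import math

# All values below are assumed for illustration.
GAMMA, GAMMA_W = 18.0, 9.81          # soil / water unit weight, kN/m^3
C, PHI = 10.0, math.radians(35)      # effective cohesion (kPa), friction angle
BETA, Z = math.radians(30), 3.0      # slope angle, slip-surface depth (m)

def fos(water_head):
    # Infinite-slope factor of safety with pore pressure u = gamma_w * h_w.
    u = GAMMA_W * water_head
    normal = GAMMA * Z * math.cos(BETA) ** 2
    shear = GAMMA * Z * math.sin(BETA) * math.cos(BETA)
    return (C + (normal - u) * math.tan(PHI)) / shear

def run_warning(rainfall_mm, drain_rate=0.02, infil=0.005):
    # Step-1 over measured/predicted rainfall; Step-2 would call the same
    # update in real time with measured and 1 h / 24 h forecast rainfall.
    h = 0.0
    for t, rain in enumerate(rainfall_mm):
        h = max(0.0, h + infil * rain - drain_rate)  # toy water balance
        f = fos(h)
        if f < 1.3:                                  # assumed threshold
            print(f"step {t}: FOS = {f:.2f} -> issue warning")

# e.g. a wet fortnight of daily rainfall totals (mm, illustrative):
run_warning([0, 5, 20, 60, 80, 120, 90, 150, 100, 60, 40, 20, 10, 0])
```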
Abstract:
Physical and chemical properties of biodiesel are influenced by structural features of the fatty acids, such as chain length, degree of unsaturation and branching of the carbon chain. This study investigated whether microalgal fatty acid profiles are suitable for biodiesel characterization and species selection through Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Graphical Analysis for Interactive Assistance (GAIA) analysis. Fatty acid methyl ester (FAME) profiles were used to calculate the likely key chemical and physical properties of the biodiesel [cetane number (CN), iodine value (IV), cold filter plugging point, density, kinematic viscosity and higher heating value] of nine microalgal species (this study) and twelve species from the literature, selected for their suitability for cultivation in subtropical climates. An equal-parameter-weighted PROMETHEE-GAIA analysis ranked Nannochloropsis oculata, Extubocellulus sp. and Biddulphia sp. highest; these were the only species meeting the EN14214 and ASTM D6751-02 biodiesel standards, except for the double-bond limit in the EN14214. Chlorella vulgaris outranked N. oculata when the twelve literature species were included. Culture growth phase (stationary) and, to a lesser extent, nutrient provision affected the CN and IV values of N. oculata due to lower eicosapentaenoic acid (EPA) contents. Applying a polyunsaturated fatty acid (PUFA) weighting to saturation led to a lower ranking for species exceeding the EN14214 double-bond thresholds. In summary, CN, IV, C18:3 and double-bond limits were the strongest drivers in the equal-parameter-weighted PROMETHEE analysis.
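PROMETHEE II itself reduces to a short computation over a criteria matrix. The sketch below implements the equal-weighted net-flow ranking with the simple "usual" preference function; the criteria values in the usage example are hypothetical, not the study's FAME-derived data.

```python
import numpy as np

def promethee_ii(X, maximize, weights):
    # X: alternatives x criteria matrix. Returns the net outranking flow
    # phi; a higher value means a better-ranked alternative.
    n, m = X.shape
    phi_plus, phi_minus = np.zeros(n), np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pi_ab = 0.0
            for j in range(m):
                d = X[a, j] - X[b, j]
                if not maximize[j]:
                    d = -d
                # "Usual" preference function: 1 if strictly better.
                pi_ab += weights[j] * (1.0 if d > 0 else 0.0)
            phi_plus[a] += pi_ab / (n - 1)
            phi_minus[b] += pi_ab / (n - 1)
    return phi_plus - phi_minus

# Hypothetical criteria for three species: CN (maximize), IV (minimize),
# kinematic viscosity (minimize), equally weighted.
X = np.array([[55.0,  90.0, 4.2],
              [48.0, 130.0, 4.8],
              [60.0,  70.0, 5.1]])
print(promethee_ii(X, maximize=[True, False, False], weights=[1/3] * 3))
```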
Abstract:
Non-periodic structural variation has been found in the high-Tc cuprates YBa2Cu3O7-x and Hg0.67Pb0.33Ba2Ca2Cu3O8+δ by image analysis of high-resolution transmission electron microscope (HRTEM) images. We use two methods for the analysis of the HRTEM images. The first is a means of measuring the bending of lattice fringes at twin planes. The second is a low-pass filter technique which enhances the information carried by diffuse-scattered electrons and reveals what appears to be an interference effect between domains of differing lattice parameter in the top and bottom of the thin foil. We believe that these methods of image analysis could be usefully applied to the many thousands of HRTEM images that have been collected by other workers in the high-temperature superconductor field. This work provides direct structural evidence for phase separation in high-Tc cuprates, and gives support to recent stripes models that have been proposed to explain various angle-resolved photoelectron spectroscopy and nuclear magnetic resonance data. We believe that the structural variation is a response to the opening of an electronic solubility gap, where holes are not uniformly distributed in the material but are confined to metallic stripes. Optimum doping may occur as a consequence of the diffuse boundaries between stripes which arise from spinodal decomposition. Theoretical treatments of the high-Tc cuprates which assume the cuprates are homogeneous may need to be modified in order to take account of this type of structural variation.
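The low-pass filtering step can be illustrated with a standard Fourier-domain operation; the sketch below is a generic implementation of that technique, not the authors' exact processing pipeline.

```python
import numpy as np

def lowpass_filter(image, cutoff):
    # Keep spatial frequencies within `cutoff` * min(height, width) of
    # the FFT centre and transform back, retaining only slowly varying
    # contrast (e.g. domain-scale lattice-parameter variation).
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    y, x = np.ogrid[:h, :w]
    r = np.hypot(y - h / 2, x - w / 2)
    F[r > cutoff * min(h, w)] = 0
    return np.fft.ifft2(np.fft.ifftshift(F)).real
```

Applied to an HRTEM image (e.g. `lowpass_filter(img, 0.05)`), such a filter suppresses the strong lattice fringes and leaves the coarse intensity modulation associated with the diffuse-scattered electrons.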
Abstract:
This research investigated airborne particle characteristics and their dynamics inside and around the envelope of mechanically ventilated office buildings, together with building thermal conditions and energy consumption. Based on these investigations, a comprehensive model was developed to facilitate the optimisation of building heating, ventilation and air conditioning systems, in order to protect the health of their occupants and minimise the energy requirements of these buildings.
Abstract:
The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for reliable, objective, automated methods of image segmentation to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) and obtain the choroid thickness profile from OCT images. Before segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which were previously manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool for extracting clinical data.
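The boundary-detection step can be illustrated with a simplified stand-in for graph search: a dynamic-programming minimum-cost path across the B-scan, constrained to move at most one row per column. The cost construction and constraints here are illustrative; the paper's algorithms use an edge filter with a directional weighted map penalty (ICB) and a dual brightness probability gradient (OCB).

```python
import numpy as np

def trace_boundary(cost):
    # cost: 2-D array (rows = depth, cols = A-scans), low on the boundary
    # of interest (e.g. derived from a vertical image gradient). Finds the
    # minimum-cost path crossing the image left to right, moving at most
    # +/- 1 row per column.
    h, w = cost.shape
    acc = cost.astype(float).copy()
    back = np.zeros((h, w), dtype=int)
    for j in range(1, w):
        for i in range(h):
            lo, hi = max(0, i - 1), min(h, i + 2)
            k = int(np.argmin(acc[lo:hi, j - 1])) + lo
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(w - 1, 0, -1):
        path.append(back[path[-1], j])
    return path[::-1]  # boundary row index for each column
```

Running this once with an ICB-style cost and once with an OCB-style cost yields the two boundary profiles whose separation gives the choroidal thickness map.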
Abstract:
In many applications, where encrypted traffic flows from an open (public) domain to a protected (private) domain, there exists a gateway that bridges the two domains and faithfully forwards the incoming traffic to the receiver. We observe that indistinguishability against (adaptive) chosen-ciphertext attacks (IND-CCA), which is a mandatory goal in face of active attacks in a public domain, can be essentially relaxed to indistinguishability against chosen-plaintext attacks (IND-CPA) for ciphertexts once they pass the gateway that acts as an IND-CCA/CPA filter by first checking the validity of an incoming IND-CCA ciphertext, then transforming it (if valid) into an IND-CPA ciphertext, and forwarding the latter to the recipient in the private domain. "Non-trivial filtering" can result in reduced decryption costs on the receivers' side. We identify a class of encryption schemes with publicly verifiable ciphertexts that admit generic constructions of (non-trivial) IND-CCA/CPA filters. These schemes are characterized by existence of public algorithms that can distinguish between valid and invalid ciphertexts. To this end, we formally define (non-trivial) public verifiability of ciphertexts for general encryption schemes, key encapsulation mechanisms, and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours. We further analyze the security impact of public verifiability and discuss generic transformations and concrete constructions that enjoy this property.
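The gateway's role can be sketched as follows. This toy models public verifiability with a one-time Ed25519 signature over the ciphertext body, so that validity can be checked without any secret key and then stripped before forwarding; it illustrates the filter mechanics only, not the paper's constructions (in particular, a bare signature over an arbitrary ciphertext does not by itself yield IND-CCA security). It assumes the `cryptography` package, and the body encryption itself is elided (any IND-CPA scheme works here).

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)

def seal(cpa_ciphertext: bytes):
    # Sender side: attach a fresh one-time verification key and signature.
    sk = Ed25519PrivateKey.generate()
    vk = sk.public_key().public_bytes(serialization.Encoding.Raw,
                                      serialization.PublicFormat.Raw)
    return cpa_ciphertext, vk, sk.sign(cpa_ciphertext)

def gateway_filter(ct):
    # Gateway side: validity is checkable from public data alone.
    body, vk, sig = ct
    try:
        Ed25519PublicKey.from_public_bytes(vk).verify(sig, body)
    except InvalidSignature:
        return None   # drop invalid ciphertexts at the domain boundary
    return body       # forward the bare (IND-CPA) ciphertext; the
                      # receiver no longer pays for the validity check
```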
Abstract:
The term fashion system describes the inter-relationships between production and consumption, illustrating how the production of fashion is a collective activity. For instance, Yuniya Kawamura notes that systems for the production of fashion differ around the globe and are subject to constant change, and Jennifer Craik draws attention to an ‘array of competing and intermeshing systems cutting across western and non-western cultures’. In China, Shanghai’s nascent fashion system seeks to emulate the Eurocentric system of Fashion Weeks and industry support groups. It promises emergent designers a platform for global competition, yet there are tensions from within. Interaction with a fashion system inevitably means becoming validated or legitimised. Legitimisation in turn depends upon gatekeepers who make aesthetic judgments about the status, quality and cultural value of a designer’s work. Notwithstanding the proliferation of fashion media, a new gatekeeper has arrived in Shanghai, seeking to filter authenticity from artifice, offering truth in a fashion market saturated with fakery and the hollowness of foreign consumptive practice, and providing a place of sanctuary for Chinese fashion design. This paper thus discusses how new agencies are allowing designers in Shanghai greater control over their brand image while creating novel opportunities for promotion and sales. It explores why designers choose this new model and provides new knowledge of the curation of fashion by these gatekeepers.
Abstract:
The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and to the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data-visualisation tool that allowed access, manipulation and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. The tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. The visualisation accounts for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within a platform, such as a tweet, a YouTube video, a Facebook post, an email, a television episode, etc. Core content is produced by the creators of the multiplatform experience to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced, and how they interact with and impact upon each other. By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
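A hypothetical sketch of the three-level data model (people, platforms, content) and a simple engagement aggregation is given below; all names and fields are illustrative, not the prototype's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ContentItem:
    id: str
    platform: str          # e.g. "twitter", "facebook", "tv"
    kind: str              # e.g. "tweet", "video", "episode"
    core: bool             # core (creator) vs complementary (fan) content
    released: datetime

@dataclass
class EngagementEvent:
    content_id: str
    person: str            # audience member, character account, etc.
    when: datetime
    weight: float          # platform-specific engagement value

def engagement_by_content(events, contents, since=None):
    # Aggregate engagement per content item, optionally time-filtered,
    # so release timing can be compared against audience activity.
    totals = {c.id: 0.0 for c in contents}
    for e in events:
        if since is None or e.when >= since:
            totals[e.content_id] = totals.get(e.content_id, 0.0) + e.weight
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```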