993 results for Digital processing


Relevance:

30.00%

Publisher:

Abstract:

A specific manufacturing process to obtain continuous glass fiber-reinforced PTFE laminates was studied and some of their mechanical properties were evaluated. Young's modulus and maximum strength were measured by three-point bending and tensile tests using the Digital Image Correlation (DIC) technique. Adhesion tests, thermal analysis and microscopy were used to evaluate the fiber-matrix adhesion, which depends strongly on the sintering time. The composite material obtained had a Young's modulus of 14.2 GPa and an ultimate strength of 165 MPa, corresponding to approximately 24 times the modulus and six times the ultimate strength of pure PTFE. These results show that the PTFE composite, manufactured under specific conditions, has great potential to provide parts with a performance suitable for structural applications.
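As a quick sanity check on the stated multipliers, a minimal Python sketch back-calculates the implied neat-PTFE properties; these values are inferred from the ratios quoted above, not independently measured:

```python
# Back-of-envelope check of the reported property ratios. The neat-PTFE
# values below are implied by the stated multipliers, not from the paper.
composite_modulus_gpa = 14.2     # reported Young's modulus of the laminate
composite_strength_mpa = 165.0   # reported ultimate strength of the laminate

neat_modulus_gpa = composite_modulus_gpa / 24    # "approximately 24 times"
neat_strength_mpa = composite_strength_mpa / 6   # "six times"

print(f"implied neat PTFE modulus:  {neat_modulus_gpa:.2f} GPa")   # ~0.59 GPa
print(f"implied neat PTFE strength: {neat_strength_mpa:.1f} MPa")  # ~27.5 MPa
```

Both implied values fall in the range commonly quoted for unfilled PTFE (a modulus of roughly 0.4 to 0.8 GPa and a tensile strength of roughly 20 to 35 MPa), which is consistent with the stated ratios.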

Relevance:

30.00%

Publisher:

Abstract:

This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics, in the twofold role of target of investigations and helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage for data, which can be leveraged to commit either new or old crimes and to host the related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry, and its technical and legal implications from the point of view of the repeatability of technical assessments, is discussed throughout the following pages and constitutes the unprecedented contribution of this work.

Relevance:

30.00%

Publisher:

Abstract:

In many industries, for example the automotive industry, digital mock-ups (Digital MockUps) are used to check the design and function of a product on a virtual prototype. One application is checking the safety clearances of individual components, the so-called clearance analysis. For selected components, engineers determine whether, both at rest and during a motion, they maintain a prescribed safety distance from the surrounding components. If components fall below the safety distance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety distance.

In this work we present a solution for computing, in real time, all regions of two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance and call it the set of all tolerance-violating primitives. We present a complete solution, which can be divided into the following three major topics.

In the first part of this work we study algorithms that check whether two triangles are tolerance-violating. We present several approaches to triangle-triangle tolerance tests and show that specialized tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the fastest.

The second part of this work deals with data structures and algorithms for computing, in real time, all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety distance in the design of both the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs that need to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. Our benchmarks show that our solutions can compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure, which we call Shrubs, for managing the cell contents of the uniform grids used above. Previous approaches to reducing the memory footprint of uniform grids rely mainly on hashing, which does not reduce the memory consumed by the cell contents themselves. In our application, neighboring cells often have similar contents. Our approach exploits this redundancy to losslessly compress the cell contents of a uniform grid to a fifth of their original size and to decompress them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we present applications to various path-planning problems.
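The thesis's own tests are not given in the abstract; as a minimal illustration of how a safety distance can be folded into a cheap, conservative pre-test (the kind of strategy the second part describes for avoiding expensive primitive-primitive tests), consider this Python sketch with illustrative names:

```python
# Conservative broad-phase tolerance test: two triangles can only be closer
# than the safety distance d if their axis-aligned bounding boxes, one of
# them inflated by d, overlap. Triangles that fail this cheap test can be
# skipped without running an exact triangle-triangle tolerance test.
def aabb(triangle):
    """Bounding box of a triangle given as three (x, y, z) tuples."""
    xs, ys, zs = zip(*triangle)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def may_violate_tolerance(tri_a, tri_b, d):
    """True if tri_a and tri_b MIGHT be closer than d (never a false negative)."""
    (a_lo, a_hi), (b_lo, b_hi) = aabb(tri_a), aabb(tri_b)
    return all(a_lo[k] - d <= b_hi[k] and b_lo[k] <= a_hi[k] + d for k in range(3))

tri1 = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
tri2 = ((0, 0, 2), (1, 0, 2), (0, 1, 2))
print(may_violate_tolerance(tri1, tri2, 1.0))  # False: boxes are 2.0 apart in z
print(may_violate_tolerance(tri1, tri2, 2.5))  # True: within the safety distance
```

If the boxes are separated by more than d along any axis, the true minimum distance must also exceed d, so only pairs passing this test need an exact check.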

Relevance:

30.00%

Publisher:

Abstract:

This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process, so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test-cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the required data for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multicylinder diesel engine have been examined from a model-training perspective. A single-cylinder engine with external air handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a large difference between exhaust and intake manifold pressures (engine ΔP) during transients, it is recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed. The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, and uneven EGR distribution has been shown to be present but unaccounted for by the ECM. The two modes and the associated phenomena are essential to understanding why transient emission models are calibration dependent and how to choose training data that will result in good model generalization.
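The abstract does not spell out the data-processing methods; a minimal sketch of a standard approach, assumed here, is to advance each trace by its estimated transport delay and invert a first-order sensor lag via x_true ≈ x_meas + τ·dx_meas/dt:

```python
import numpy as np

def compensate(signal, time, transport_delay, tau):
    """Time-align a measured trace and invert a first-order sensor lag.

    transport_delay: estimated dead time [s] from cylinder event to sensor;
    tau: sensor time constant [s]. Both are assumed to have been identified
    separately, and sampling is assumed uniform.
    """
    dt = time[1] - time[0]
    shift = int(round(transport_delay / dt))
    if shift:
        # Advance the trace by the transport delay, holding the final value.
        signal = np.concatenate([signal[shift:], np.full(shift, signal[-1])])
    # First-order lag inversion: x_true ~ x_meas + tau * dx_meas/dt.
    return signal + tau * np.gradient(signal, dt)
```

Differentiating a noisy trace amplifies noise, so in practice some smoothing would precede the lag inversion.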

Relevance:

30.00%

Publisher:

Abstract:

Solid-state shear pulverization (SSSP) is a unique processing technique for the mechanochemical modification of polymers, compatibilization of polymer blends, and exfoliation and dispersion of fillers in polymer nanocomposites. A systematic parametric study of the SSSP technique is conducted to elucidate the detailed mechanism of the process and establish the basis for a range of current and future operating scenarios. Using neat, single-component polypropylene (PP) as the model material, we varied machine type, screw design, and feed rate to achieve a range of shear and compression applied to the material, which can be quantified through the specific energy input (Ep). As a universal processing variable, Ep reflects the level of chain scission occurring in the material, which correlates well with the extent of the physical property changes of the processed PP. Additionally, we compared operating cost estimates for SSSP and conventional twin-screw extrusion to determine the practical viability of SSSP.
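Ep is not defined in the abstract; a common definition, assumed here, is the net shaft energy delivered per unit mass of polymer fed to the machine:

```python
def specific_energy_input(motor_power_kw, idle_power_kw, feed_rate_kg_per_h):
    """Specific energy input Ep in kJ/kg: net shaft power divided by feed rate.

    Assumed definition; subtracting the idle power isolates the power
    actually dissipated in the material rather than in the drivetrain.
    """
    net_power_kw = motor_power_kw - idle_power_kw
    return net_power_kw * 3600.0 / feed_rate_kg_per_h  # kW·h -> kJ via 3600 s/h

# Illustrative numbers: 5.0 kW draw under load, 1.5 kW idle, 2 kg/h feed.
print(specific_energy_input(5.0, 1.5, 2.0))  # 6300 kJ/kg
```

Under this definition, lowering the feed rate or using a more aggressive screw design raises Ep, which is consistent with the parameters the study varies.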

Relevance:

30.00%

Publisher:

Abstract:

We investigated the effect of level-of-processing manipulations on “remember” and “know” responses in episodic melody recognition (Experiments 1 and 2) and how this effect is modulated by item familiarity (Experiment 2). In Experiment 1, participants performed 2 conceptual and 2 perceptual orienting tasks while listening to familiar melodies: judging the mood, continuing the tune, tracing the pitch contour, and counting long notes. The conceptual mood task led to higher d' rates for “remember” but not “know” responses. In Experiment 2, participants either judged the mood or counted long notes of tunes with high and low familiarity. A level-of-processing effect emerged again in participants’ “remember” d' rates regardless of melody familiarity. Results are discussed within the distinctive processing framework.
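The d' values reported here come from signal detection theory; a minimal sketch of the standard computation (the trial counts below are made up for illustration):

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) keeps z() finite when a
    rate would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical session: 18 of 20 old melodies hit, 4 of 20 new ones false-alarmed.
print(d_prime(18, 2, 4, 16))  # ~1.97
```

Computing d' separately for "remember" and "know" responses, as in the study, simply means counting hits and false alarms within each response category.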

Relevance:

30.00%

Publisher:

Abstract:

We describe a recent offering of a linear systems and signal processing course for third-year electrical and computer engineering students. This course is a prerequisite for our first digital signal processing course. Students have traditionally viewed linear systems courses as mathematical and extremely difficult. Without compromising the rigor of the required concepts, we strove to make the course fun, with application-based, hands-on laboratory projects. These projects can be modified easily to match individual instructors' preferences.
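As an illustration of the application-based, hands-on flavor described (this particular exercise is not from the paper), a lab might have students hear an LTI system act on a tone:

```python
import numpy as np

# Illustrative lab-style exercise: convolve a tone with a two-tap echo
# impulse response and observe the output length of the LTI system.
fs = 8000                                   # sample rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 440 * t)             # 440 Hz input tone
h = np.zeros(fs // 2)
h[0], h[-1] = 1.0, 0.6                      # direct path plus a 0.5 s echo at 60%
y = np.convolve(x, h)                       # y = x * h, the system output
print(len(x), len(h), len(y))               # len(y) == len(x) + len(h) - 1
```

Writing y to a sound file and listening to the echo turns the convolution theorem from a formula into an experience, which is the spirit of the projects described.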

Relevance:

30.00%

Publisher:

Abstract:

The fracture properties of high-strength spray-formed Al alloys were investigated, with consideration of the effects of elemental additions such as zinc, manganese, and chromium and the influence of the addition of SiC particulate. Fracture resistance values between 13.6 and 25.6 MPa·m^(1/2) were obtained for the monolithic alloys in the T6 and T7 conditions, respectively. The alloys with SiC particulate compared well, achieving fracture resistance values between 18.7 and 25.6 MPa·m^(1/2). The spray-formed materials exhibited a loss in fracture resistance (K_I) compared to ingot-metallurgy 7075 alloys but performed better than high-solute powder-metallurgy alloys of similar composition. Characterization of the fracture surfaces indicated predominantly intergranular decohesion, possibly facilitated by the presence of incoherent particles in the grain boundary regions and by the large strength differential between the matrix and the precipitate zone. It is believed that at slip band-grain boundary intersections, particularly in the presence of large dispersoids and/or inclusions, microvoid nucleation would be significantly enhanced. Differences between the fracture surfaces of the alloys in the T6 and T7 conditions were observed and are attributed to inhomogeneous slip distribution, which results in strain localization at grain boundaries. The best overall combination of fracture resistance properties was obtained for alloys with minimum amounts of chromium and manganese additions.
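To put fracture resistance values in context, a minimal sketch estimates the critical flaw size from K = σ√(πa) with geometry factor Y = 1; the 400 MPa service stress is a hypothetical value, not from the study:

```python
import math

def critical_flaw_size_mm(k_mpa_sqrt_m, stress_mpa):
    """Critical crack length a_c [mm] from K = sigma * sqrt(pi * a), Y = 1."""
    a_m = (k_mpa_sqrt_m / stress_mpa) ** 2 / math.pi
    return a_m * 1000.0

# Hypothetical 400 MPa service stress, plausible for a 7xxx-series alloy:
for k in (13.6, 25.6):
    print(f"K = {k} MPa·m^(1/2) -> a_c = {critical_flaw_size_mm(k, 400.0):.2f} mm")
# 13.6 -> ~0.37 mm; 25.6 -> ~1.30 mm
```

The near doubling of fracture resistance from T6 to T7 thus translates, at fixed stress, into roughly a fourfold larger tolerable flaw, since a_c scales with K squared.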

Relevance:

30.00%

Publisher:

Abstract:

Investigates multiple processing parameters of solid-state shear pulverization (SSSP), including polymer type, filler type, processing technique, severity of SSSP processing, and postprocessing. HDPE and LLDPE polymers with pristine clay and organoclay samples are explored. Effects on crystallization, high-temperature behavior, mechanical properties, and gas barrier properties are examined. Thermal, mechanical, and morphological characterization is conducted to determine polymer/filler compatibility and superior processing methods for the polymer-clay nanocomposites.

Relevance:

30.00%

Publisher:

Abstract:

Biodegradable nanoparticles are at the forefront of drug delivery research as they provide numerous advantages over traditional drug delivery methods. An important factor affecting the ability of nanoparticles to circulate within the bloodstream and interact with cells is their morphology. In this study a novel processing method, confined impinging jet mixing, was used to form poly(lactic acid) nanoparticles through a solvent-diffusion process, with Pluronic F-127 used as a stabilizing agent. The study focused on the effects of Reynolds number (flow rate), surfactant presence during mixing, and polymer concentration on the morphology of the poly(lactic acid) nanoparticles. In addition to examining the parameters affecting particle morphology, the study attempted to improve nanoparticle isolation and purification methods, both to increase nanoparticle yield and to ensure specific morphologies were not being excluded during isolation and purification. The isolation and purification methods used were centrifugation and a stirred cell. The study successfully produced particles with pyramidal and cubic morphologies; however, the yield of non-spherical particles was very low, and great variability existed between replicate trials. Surfactant was determined to be very important for the stabilization of nanoparticles in solution but appears to be unnecessary for their formation. Isolation and purification methods that produce a high yield of surfactant-free particles have still not been perfected, and additional testing will be necessary for improvement.
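The abstract does not give the mixer geometry; a minimal sketch, assuming a circular jet of known bore and water-like solvent properties, shows how the Reynolds number follows from the imposed flow rate:

```python
import math

def jet_reynolds(flow_rate_ml_min, diameter_mm, density=1000.0, viscosity=1.0e-3):
    """Re = rho * v * D / mu for a circular jet; defaults approximate water at 20 C.

    flow_rate_ml_min: volumetric flow per jet [mL/min]; diameter_mm: jet bore [mm].
    Geometry and fluid properties here are illustrative assumptions.
    """
    q = flow_rate_ml_min * 1e-6 / 60.0        # volumetric flow [m^3/s]
    d = diameter_mm * 1e-3                    # jet diameter [m]
    v = q / (math.pi * d**2 / 4.0)            # mean jet velocity [m/s]
    return density * v * d / viscosity

print(jet_reynolds(30.0, 0.5))  # ~1273 for a 0.5 mm jet at 30 mL/min
```

Because velocity scales inversely with the jet cross-section, varying the pump flow rate is a direct handle on Re, which is why the study treats the two interchangeably.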

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents two frameworks, a software framework and a hardware core manager framework, which together can be used to develop a processing platform using a distributed system of field-programmable gate array (FPGA) boards. The software framework provides users with the ability to easily develop applications that exploit the processing power of FPGAs, while the hardware core manager framework gives users the ability to configure and interact with multiple FPGA boards and/or hardware cores. The thesis describes the design and development of these frameworks and analyzes the performance of a system that was constructed using them. The performance analysis included measuring the effect of incorporating additional hardware components into the system and comparing the system to a software-only implementation. The work draws conclusions based on the results of the performance analysis and offers suggestions for future work.
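A minimal sketch of the kind of back-end comparison the performance analysis describes; run_software and run_fpga are hypothetical stand-ins, not functions from the thesis:

```python
import time

def measure(run, payload, repeats=10):
    """Average wall-clock seconds for one processing pass over the payload."""
    start = time.perf_counter()
    for _ in range(repeats):
        run(payload)
    return (time.perf_counter() - start) / repeats

def report(run_software, run_fpga, payload):
    """Print the headline figure of the comparison: speedup = t_sw / t_hw."""
    t_sw = measure(run_software, payload)   # software-only implementation
    t_hw = measure(run_fpga, payload)       # FPGA-accelerated implementation
    print(f"software {t_sw:.4f}s  fpga {t_hw:.4f}s  speedup {t_sw / t_hw:.2f}x")
```

Speedups below 1 are possible for small payloads, where the cost of moving data to the boards can dominate the hardware's processing advantage.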

Relevance:

30.00%

Publisher:

Abstract:

Biodegradable polymer/clay nanocomposites were prepared with pristine and organically modified montmorillonite in polylactic acid (PLA) and polycaprolactone (PCL) matrices. Nanocomposites were fabricated using extrusion and SSSP to compare the effects of melt-state and solid-state processing on the morphology of the final nanocomposite. Characterization of various material properties was performed on the prepared nanocomposites to evaluate the property enhancements obtained with different clays and/or processing methods.