806 results for Internet of things, Mqtt, domotica, Raspberry Pi
Abstract:
A new intellectual epoch has generated new enterprises to suit changed beliefs and circumstances. A widespread sentiment in both formal historiography and curriculum studies reduces the “new” to the question of how knowledge is recognized as such, how it is gained, and how it is represented in narrative form. Whether the nature of history and conceptions of knowledge are, or ought to be, central considerations in curriculum studies, reducible to purposes or elevated as present-oriented, requires rethinking. This paper operates as an incitement to discourse that disrupts the protection and isolation of primary categories in the field whose troubling is overdue. In particular, the paper moves through several layers that highlight the lack of settlement regarding the endowment of objects of study with the status of the scientific. It traces how some “invisible” things, and not others, have been included within the purview of curriculum history as objects of study. The focus is the making of things deemed invisible into scientific objects (or not), and the specific site of analysis is the work of William James (1842-1910). James studied both the child mind and the ghost intensely; the former became scientized and legitimated for further study, the latter abjected. This contrast opens key points for reconsideration regarding conditions of proof, validation criteria, and subject matters, and points to opportunities to challenge some well-rehearsed foreclosures within progressive politics and education.
Abstract:
My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925-1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition which Deleuze names representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and explaining the manifest diversity of the given by such notions as essence, idea, God, or the totality of the world. In contrast to this, Deleuze states that abstractions such as these do not explain anything, but rather that they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze names his philosophy transcendental empiricism and seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality.
As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic, the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice, and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity as a work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.
Abstract:
This paper concentrates on Heraclitus, Parmenides and Lao Zi. The focus is on their ideas on change and on whether the world is essentially One or composed of many entities. In the first chapter I go over some general tendencies in Greek and Chinese philosophy. Differences in cultural background influence the ways philosophy is done, but the paper aims to show that two questions can be brought up when comparing the philosophies of Heraclitus, Parmenides and Lao Zi: is the world essentially One or Many? Is change real, and if it is, what is its nature and how does it take place? For Heraclitus change is real and, as will be shown later in the chapter, quite essential for the sustainability of the world-order (kosmos). The key concept in the case of Heraclitus is Logos. Heraclitus uses Logos in several senses, most famously in relation to his element theory. But another important feature of the Logos, the content of real wisdom, is the ability to regard everything as one. This does not mean that the world is essentially one for Heraclitus in the ontological sense, but that we should see the underlying unity of multiple phenomena. Heraclitus expresses this as hen panta: All from One, One from All. I characterize Heraclitus as an epistemic monist and an ontological pluralist. It is plausible that Heraclitus' views on change were the focus of Parmenides’ severe criticism. Parmenides held the view that the world is essentially one and that to see it as consisting of many entities was the error of mortals, i.e. the common man and his philosophical predecessors. For Parmenides, what-is can be approached by two routes: the Way of Truth (Aletheia) and the Way of Seeming (Doxa). Aletheia essentially sees the world as one, where even time is an illusion. In Doxa, Parmenides gives an explanation of the world seen as consisting of many entities, and this is his contribution to the line of thought of his predecessors.
It should be noted that a strong emphasis is given to Aletheia, whereas the world-view given in Doxa is only probable. I go on to describe Parmenides as an ontological monist who grants some plausibility to pluralistic views. In the work of Lao Zi, the world can be seen as One or as consisting of many entities. In my interpretation, Lao Zi uses Dao in two different senses: Dao is the totality of things, or the order in change. The wu-aspect (seeing-without-form) attends to the world as one, whereas the you-aspect attends to the world of many entities. In the wu-aspect, Dao refers to the totality of things, while in the you-aspect Dao is the order or law in change. There are two insights in Lao Zi regarding the relationship between the wu- and you-aspects: in ch. 1 it is stated that they are two separate aspects of seeing the world, while other chapters hold that you comes from wu. This naturally raises the question of whether the One is the peak of seeing the world as many; in other words, whether there is a way from pluralism to monism. All these considerations make it probable that the work attributed to Lao Zi has had new material added to it, or is a compilation of oral sayings. At the end of the paper I give some insights on how Logos and Dao can be compared in a relevant manner. I also compare Parmenides' holistic monism to Lao Zi’s Dao as nameless totality (i.e. in its wu-aspect), and briefly touch on Heidegger and the future of comparative philosophy.
Abstract:
For decades psychoanalysis was the discipline studying the unconscious, and other branches of study lacked the competence to take a stand on issues concerning it. From the 1980s onwards, intense study of the unconscious has taken place within the cognitive orientation. Thus, one nowadays speaks of both the psychoanalytic and the cognitive unconscious. The aim of this thesis is to integrate the psychoanalytic and cognitive views. When the "Freudian" conception of the unconscious is considered, there are four entangled issues: 1) what is the unconscious like; 2) how does the unconscious give rise to psychic disorders; 3) why and how are certain contents missing from consciousness (repression of contents); 4) how do those contents emerge (becoming conscious of the repressed). The conventional psychoanalytic answer to the first question - and "the cornerstone of psychoanalysis" - is that "the unconscious is mental". Issues 2)-4) depend radically on the answer given to 1): the "psychoanalytic" conceptualizations of them rest on this "cornerstone". That ground was challenged in study I: it was argued that it has never been clear what it means that the unconscious is mental. Thus it was stated that, in the current state of the art, psychoanalysis should drop the epithet "mental" before the term unconscious. That claim creates pressure to reappraise the conventional "psychoanalytic" answers to the other questions, and that reappraisal was the aim of studies II and III. In study II, question 2) is approached in terms of implicit knowledge. Study III focuses on the mechanisms that determine which contents appear within the scope of consciousness and which are missing from it (questions 3) and 4)). At the core of study III are distinctions concerning processes occurring at the levels of the brain, consciousness, self-consciousness, and narrative self-consciousness.
Studies I-III set "psychoanalytic" topics within the frame of the cognitive view. The picture emerging from those studies is not especially useful for a clinician (psychotherapist). Studies IV and V address that issue. Study IV is a rather serious critique of neuropsychoanalysis. It argued that the repressive functions of conscious states are at the core of clinical psychoanalysis, and that functions in general cannot be reduced to neurophysiological terminology. Thus, the limits of neuropsychoanalysis are stricter than has been realized: crucial clinical issues remain outside its scope. Study V focused on the puzzling situation that although unconscious fantasies do not exist, the idea of them has been an important conceptual tool for clinicians. When put in a larger context, the aim of study V is similar to that of study IV: to determine the relation between psychotherapists' and neuroscientists' terminologies. Studies III, IV and V apply the philosopher Daniel Dennett's model of different levels of explanation.
Abstract:
We study the constraints arising on the expansion parameters c and d of the pion electromagnetic form factor from the inclusion of pure spacelike data and the phase of timelike data along with one spacelike datum, using as input the first derivative of the QCD polarization amplitude, Π'(-Q^2). These constraints, when combined with other analyses, provide a valuable check on a determination of c due to Guo et al. and on our previous work, where the pionic contribution to the (g - 2) of the muon was used as the input. This work further illustrates the power of analyticity techniques in form factor analysis.
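For context, the parameters c and d are the coefficients of the low-energy Taylor expansion of the form factor. A standard parameterization (stated here as background, not reproduced from the paper itself) is

```latex
F_\pi(t) = 1 + \frac{1}{6}\,\langle r_\pi^2 \rangle\, t + c\, t^2 + d\, t^3 + \cdots
```

with t the momentum transfer squared and ⟨r_π²⟩ the pion charge radius squared.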
Abstract:
1. The mechanism of absorption of phosphatidylcholine was studied in rats by injecting into the intestine phosphatidylcholine specifically labelled either in the fatty acid or in the glycerol moiety or with 32P; considerable amounts of 1-acyl-lysophosphatidylcholine were found in the intestinal lumen. 2-([14C]Acyl)phosphatidylcholine gave markedly more radioactive unesterified fatty acids in the lumen than the 1-([14C]acyl) derivative. Some of the radioactivity from either the fatty acid or the glycerol moiety of the injected phosphatidylcholine appeared in the mucosal triacylglycerols. 2. Injection of 32P-labelled phosphatidylcholine or 32P-labelled lysophosphatidylcholine led to the appearance of radioactive glycerylphosphorylcholine, glycerophosphate and Pi in the mucosa. 3. Rat mucosa was found to contain a highly active glycerylphosphorylcholine diesterase. 4. It was concluded that dietary phosphatidylcholine is hydrolysed in the intestinal lumen by pancreatic phospholipase A to 1-acylglycerylphosphorylcholine, which on entering the mucosal cell is partly reacylated to phosphatidylcholine, while the rest is further hydrolysed to glycerylphosphorylcholine, glycerophosphate, glycerol and Pi. The fatty acids and glycerophosphate are then reassembled to give triacylglycerols via the Kennedy (1961) pathway.
Abstract:
The use of UAVs for remote sensing tasks, e.g. agriculture and search and rescue, is increasing. The ability of a UAV to autonomously find a target and perform on-board decision making, such as descending to a new altitude or landing next to a target, is a desired capability. Computer-vision functionality allows the Unmanned Aerial Vehicle (UAV) to follow a designated flight plan, detect an object of interest, and change its planned path. In this paper we describe a low-cost, open-source system in which all image processing is performed on board the UAV using a Raspberry Pi 2 single-board computer interfaced with a camera. The Raspberry Pi and the autopilot are physically connected over a serial link and communicate via MAVProxy. The Raspberry Pi continuously monitors the flight path in real time through a USB camera module, and the algorithm checks whether the target has been captured. If the target is detected, its position in the frame is expressed in Cartesian coordinates and converted into estimated GPS coordinates. In parallel, the autopilot receives the approximate GPS location of the target and decides how to guide the UAV to the new location. This system also has potential uses in precision agriculture, e.g. detecting plant pests and disease outbreaks, which cause detrimental financial damage to crop yields if not caught early. Results show the algorithm detects 99% of objects of interest and that the UAV is capable of navigating and making decisions on board.
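The frame-to-GPS conversion step described above can be sketched as follows. This is a minimal illustration with hypothetical function names and camera parameters, not the authors' code: it assumes a nadir-pointing camera and a north-aligned image top, scales the target's pixel offset from the image centre by altitude and field of view into a metric ground offset, and applies a small-offset degrees conversion.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def pixel_to_gps(px, py, img_w, img_h, altitude_m,
                 hfov_deg, vfov_deg, uav_lat, uav_lon):
    """Estimate target GPS position from its pixel coordinates.

    Assumes a nadir-pointing camera with the image top facing north;
    both are simplifying assumptions, not claims about the paper's setup.
    """
    # Ground footprint of the image at the current altitude
    ground_w = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    ground_h = 2 * altitude_m * math.tan(math.radians(vfov_deg) / 2)

    # Metric offset of the target from the image centre
    dx_east = (px - img_w / 2) / img_w * ground_w
    dy_north = (img_h / 2 - py) / img_h * ground_h  # image y grows downward

    # Convert metres to degrees (small-offset approximation)
    dlat = math.degrees(dy_north / EARTH_RADIUS_M)
    dlon = math.degrees(dx_east / (EARTH_RADIUS_M *
                                   math.cos(math.radians(uav_lat))))
    return uav_lat + dlat, uav_lon + dlon

# A target at the exact image centre maps back to the UAV's own position
lat, lon = pixel_to_gps(320, 240, 640, 480, 30.0, 60.0, 45.0, -35.36, 149.16)
```

The resulting estimate could then be sent to the autopilot as a guided-mode waypoint over the MAVProxy link.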
Abstract:
4-Bromomethylcoumarins (1) reacted with sodium azide in aqueous acetone to give 4-azidomethyl-coumarins (2), which underwent 1,3-dipolar cycloaddition with acetylenic dipolarophiles to give triazoles (3). These triazoles (3) have been found to exhibit interesting variations in the chemical shifts of C-3-H and C-4-methylene protons. Protonation studies indicate that the shielding effect of the C-3-H of coumarin is due to pi-electrons of the triazole ring, further supported by diffraction and computational studies.
Abstract:
This paper proposes a control method that can balance the input currents of a three-phase three-wire boost rectifier under unbalanced input voltage conditions. The control objective is to operate the rectifier in high-power-factor mode under balanced input voltage conditions, but to give overriding priority to the current-balance function in case of unbalance in the input voltage. The control structure is divided into two major functional blocks. The inner-loop current-mode controller implements resistor emulation to achieve high-power-factor operation on each of the two orthogonal axes of the stationary reference frame. The outer control loop performs magnitude-scaling and phase-shifting operations on the current of one axis to balance it with the current on the other axis. The coefficients of the scaling and shifting functions are determined by two closed-loop proportional-integral (PI) controllers that impose the conditions of input current balance as the PI references. The control algorithm is simple and performs well; it requires neither input voltage sensing nor transformation of the control variables into a rotating reference frame. Simulation results on a MATLAB-SIMULINK platform validate the proposed control strategy. In the implementation, Texas Instruments' TMS320F240 digital signal processor is used as the digital controller. The control algorithm for high-power-factor operation was tested on a prototype boost rectifier under nominal and unbalanced input voltage conditions.
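The outer-loop balance regulators described above are standard discrete PI controllers. A generic sketch follows, with illustrative gains, sample time, and a toy first-order plant, none of which are the paper's tuned values or its actual rectifier model:

```python
class DiscretePI:
    """Discrete proportional-integral controller with output clamping.

    kp and ki are illustrative gains; ts is the sampling period in seconds.
    """
    def __init__(self, kp, ki, ts, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        # Forward-Euler integration of the error
        self.integral += self.ki * error * self.ts
        out = self.kp * error + self.integral
        # Clamp, and undo the integration step when saturated (anti-windup)
        if out > self.out_max:
            self.integral -= self.ki * error * self.ts
            out = self.out_max
        elif out < self.out_min:
            self.integral -= self.ki * error * self.ts
            out = self.out_min
        return out

# Drive a crude first-order plant toward a balance reference of 1.0
pi_ctrl = DiscretePI(kp=0.5, ki=20.0, ts=1e-3, out_min=-2.0, out_max=2.0)
y = 0.0
for _ in range(2000):
    u = pi_ctrl.update(1.0, y)
    y += (u - y) * 0.05  # toy plant, not a rectifier model
```

In the paper's scheme, two such loops would set the magnitude-scaling and phase-shifting coefficients so that the axis currents match.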
Abstract:
The design of present-generation uncooled Hg1-xCdxTe infrared photon detectors relies on complex heterostructures with a basic unit cell of type n+/π/p+. We present an analysis of a double-barrier n+/π/p+ mid-wave infrared (x = 0.3) HgCdTe detector for near-room-temperature operation using numerical computations. The present work proposes an accurate and generalized methodology, in terms of device design, material properties, and operating temperature, for studying the effects of the position dependence of carrier concentration, electrostatic potential, and generation-recombination (g-r) rates on detector performance. Position-dependent profiles of the electrostatic potential, carrier concentration, and g-r rates were simulated numerically. The performance of the detector was studied as a function of the doping concentrations of the absorber and contact layers, the widths of both layers, and the minority carrier lifetime. A responsivity of ~0.38 A/W, a noise current of ~6 × 10^-14 A/√Hz, and a D* of ~3.1 × 10^10 cm·√Hz/W at 0.1 V reverse bias were calculated using optimized values of the doping concentration, absorber width, and carrier lifetime. The suitability of the method is illustrated by demonstrating the feasibility of achieving optimum device performance through careful selection of the device design and other parameters. (C) 2010 American Institute of Physics. doi:10.1063/1.3463379
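The figures of merit quoted in this abstract are linked by the standard relation D* = R·√A / i_n, where R is the responsivity, A the detector area, and i_n the noise current spectral density. A quick consistency check, using an assumed pixel size (the abstract does not state one; the ~49 µm side below is chosen purely so the quoted numbers can be compared):

```python
import math

# Values quoted in the abstract
responsivity = 0.38       # A/W
noise_current = 6e-14     # A/sqrt(Hz)
d_star_quoted = 3.1e10    # cm*sqrt(Hz)/W

# Hypothetical square pixel, ~49 um on a side (NOT given in the abstract)
side_cm = 49e-4
area_cm2 = side_cm ** 2

# Specific detectivity: D* = R * sqrt(A) / i_n
d_star = responsivity * math.sqrt(area_cm2) / noise_current
```

With that assumed area, the computed D* lands close to the quoted ~3.1 × 10^10 cm·√Hz/W, illustrating how the three figures of merit hang together.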
Abstract:
gamma delta T-cell receptor-bearing T cells (gamma delta T cells) are readily activated by intracellular bacterial pathogens such as Mycobacterium tuberculosis. The bacterial antigens responsible for gamma delta T-cell activation remain poorly characterized. We have found that heat treatment of live M. tuberculosis bacilli released into the supernatant an antigen which stimulated human gamma delta T cells. gamma delta T-cell activation was measured by determining the increase in the percentage of gamma delta T cells by flow cytometry in peripheral blood mononuclear cells stimulated with antigen, and by proliferation of gamma delta T-cell lines with monocytes as antigen-presenting cells. Supernatant from heat-treated M. tuberculosis was fractionated by fast-performance liquid chromatography (FPLC) on a Superose 12 column. Maximal gamma delta T-cell activation was measured for a fraction of 10 to 14 kDa. Separation of the supernatant by preparative isoelectric focusing demonstrated peak activity at a pI of <4.0. On two-dimensional gel electrophoresis, the 10- to 14-kDa FPLC fraction contained at least seven distinct molecules, of which two had a pI of <4.5. Protease treatment reduced the bioactivity of the 10- to 14-kDa FPLC fraction for both resting and activated gamma delta T cells. Murine antibodies raised to the 10- to 14-kDa fraction reacted by enzyme-linked immunosorbent assay with antigens of 10 to 14 kDa in lysate of M. tuberculosis. In addition, gamma delta T cells proliferated in response to an antigen of 10 to 14 kDa present in M. tuberculosis lysate. The gamma delta T-cell-stimulating antigen was not found in culture filtrate of M. tuberculosis but was associated with the bacterial pellet and lysate of M. tuberculosis. These results provide a preliminary characterization of a 10- to 14-kDa, cell-associated, heat-stable, low-pI protein antigen of M. tuberculosis which is a major stimulus for human gamma delta T cells.
Abstract:
Quantum cell models for delocalized electrons provide a unified approach to the large NLO responses of conjugated polymers and pi-pi* spectra of conjugated molecules. We discuss exact NLO coefficients of infinite chains with noninteracting pi-electrons and finite chains with molecular Coulomb interactions V(R) in order to compare exact and self-consistent-field results, to follow the evolution from molecular to polymeric responses, and to model vibronic contributions in third-harmonic-generation spectra. We relate polymer fluorescence to the alternation delta of transfer integrals t(1+/-delta) along the chain and discuss correlated excited states and energy thresholds of conjugated polymers.
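For a non-interacting chain with alternating transfer integrals t(1±δ), the tight-binding band energies are ±|t(1+δ) + t(1-δ)e^{ik}|, so the band gap closes as δ → 0 and equals 4tδ for a dimerized chain. A short numerical check of this textbook result (illustrative t and δ in the polyacetylene-like range, not values from the paper):

```python
import cmath
import math

def band_gap(t, delta, nk=2001):
    """Minimum direct gap of a dimerized tight-binding chain.

    Band energies are +/- |t1 + t2*exp(ik)| with t1 = t(1+delta),
    t2 = t(1-delta); the gap is minimal at the zone edge k = pi.
    """
    t1, t2 = t * (1 + delta), t * (1 - delta)
    return min(2 * abs(t1 + t2 * cmath.exp(1j * k))
               for k in (math.pi * i / (nk - 1) for i in range(nk)))

t, delta = 2.5, 0.07      # eV; illustrative polyacetylene-like numbers
gap = band_gap(t, delta)  # analytic result: 4 * t * delta
```

The vanishing gap at δ = 0 is the metallic limit; any finite alternation opens the gap linearly in δ.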
Abstract:
Electron transfer is an essential activity in biological systems. The migrating electron originates from water-oxygen in photosynthesis and reverts to dioxygen in respiration. In this cycle two metal porphyrin complexes possessing circular conjugated systems and macrocyclic pi-clouds, chlorophyll and hemes, play a decisive role in mobilising electrons for travel over biological structures as extraneous electrons. Transport of electrons within proteins (as in cytochromes) and within DNA (during oxidative damage and repair) is known to occur. Initial evaluations did not favour the formation of semiconducting pathways of delocalized electrons in the peptide bonds of proteins and the bases of nucleic acids. Direct measurement of the conductivity of bulk material and quantum chemical calculations on their polymeric structures also did not support electron transfer in either proteins or nucleic acids. New experimental approaches have revived interest in the process of charge transfer through the DNA duplex. The fluorescence on photoexcitation of a Ru-complex was found to be quenched by a Rh-complex when both were tethered to DNA and intercalated in the base stack. Similar experiments showed that damage to G-bases and repair of T-T dimers in DNA can occur by possible long-range electron transfer through the base stack. The novelty of this phenomenon prompted the apt name, chemistry at a distance. Based on experiments with ruthenium-modified proteins, intramolecular electron transfer in proteins is now proposed to use pathways that include C-C sigma-bonds and, surprisingly, hydrogen bonds, which remained out of favour for a long time. In support of this, some experimental evidence is now available showing that hydrogen-bond bridges facilitate transfer of electrons between metal-porphyrin complexes. By molecular orbital calculations over 20 years ago,
we found that "delocalization of an extraneous electron is pronounced when it enters low-lying virtual orbitals of the electronic structures of peptide units linked by hydrogen bonds". This review focuses on supramolecular electron transfer pathways that can emerge when hitherto unnoticed pi-cloud structures in proteins and nucleic acids are interlinked by hydrogen bonds and metal coordination, potentially useful in catalysis and energy applications.
Abstract:
Several pi-electron-rich fluorescent aromatic compounds containing trimethylsilylethynyl functionality have been synthesized by employing the Sonogashira coupling reaction, and they were fully characterized by NMR (H-1, C-13) and IR spectroscopy. Incorporation of bulky trimethylsilylethynyl groups on the periphery of the fluorophores prevents self-quenching of the initial intensity through pi-pi interaction and thereby maintains spectroscopic stability in solution. These compounds showed fluorescence behavior in chloroform solution and were used as selective fluorescence sensors for the detection of electron-deficient nitroaromatics. All of these fluorophores showed the largest quenching response, with high selectivity, for nitroaromatics among the various electron-deficient aromatic compounds tested. Quantitative analysis of the fluorescence titration profile of 9,10-bis(trimethylsilylethynyl)anthracene with picric acid provided evidence that this particular fluorophore detects picric acid even at the ppb level. A sharp visual detection of 2,4,6-trinitrotoluene was observed upon subjecting the 1,3,6,8-tetrakis(trimethylsilylethynyl)pyrene fluorophore to increasing quantities of 2,4,6-trinitrotoluene in chloroform. Furthermore, thin films of the fluorophores were made by spin-coating a 1.0 × 10^-3 M solution in chloroform or dichloromethane onto a quartz plate and were used for the detection of vapors of nitroaromatics at room temperature. The vapor-phase sensing experiments suggested that the sensing process is reproducible and quite selective for nitroaromatic compounds. The selective fluorescence quenching response, including a sharp visual color change for nitroaromatics, makes these fluorophores promising fluorescent sensor materials for nitroaromatic compounds (NACs), with detection limits at even the ppb level, as judged with picric acid.
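Fluorescence titrations of the kind described above are commonly analysed with the Stern-Volmer relation F0/F = 1 + K_SV·[Q], where a linear plot of F0/F - 1 against quencher concentration yields the quenching constant K_SV. A stdlib-only sketch of that analysis, on synthetic data (the concentrations, intensities, and the ~45,000 M^-1 constant below are illustrative, not the paper's measurements):

```python
# Stern-Volmer analysis: F0/F = 1 + Ksv * [Q]
# Synthetic titration data (illustrative, not from the paper)
quencher_M = [0.0, 2e-6, 4e-6, 6e-6, 8e-6]   # quencher concentration, M
intensity  = [1000.0, 917.4, 847.5, 787.4, 735.3]  # fluorescence intensity

f0 = intensity[0]
x = quencher_M[1:]
y = [f0 / f - 1.0 for f in intensity[1:]]

# Least-squares slope through the origin gives Ksv (in M^-1)
ksv = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
```

A larger K_SV means stronger quenching per unit concentration, which is one way a ppb-level detection claim for a given quencher can be made quantitative.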
Abstract:
The problem of on-line recognition and retrieval of relatively weak industrial signals, such as partial discharges (PD) buried in excessive noise, is addressed in this paper. The major bottleneck is the recognition and suppression of stochastic pulsive interference (PI), owing to the overlapping broadband frequency spectra of PI and PD pulses; on-line, on-site PD measurement is therefore hardly possible with conventional frequency-domain DSP techniques. The observed PD signal is modeled as a linear combination of systematic and random components employing probabilistic principal component analysis (PPCA), and the pdf of the underlying stochastic process is obtained. The PD/PI pulses are taken as the mean of the process and modeled using non-parametric methods based on smooth FIR filters, with a maximum a posteriori (MAP) procedure employed to estimate the filter coefficients. The classification of the pulses is undertaken using a simple PCA classifier. The methods proposed by the authors were found to be effective in automatic retrieval of PD pulses while completely rejecting PI.
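The signal model above (observed sweep = systematic pulse component + random noise, with the pulse as the mean of the process) can be illustrated with a heavily simplified, stdlib-only sketch: the systematic component is estimated as an ensemble mean, and each sweep is flagged as pulse-bearing by its correlation with that template. This illustrates only the decomposition idea, not the authors' PPCA/MAP estimator, and it cheats by using known labels to build the template.

```python
import math
import random

random.seed(42)

def gaussian_pulse(n, centre, width, amp):
    """Synthetic PD-like pulse shape (illustrative only)."""
    return [amp * math.exp(-((i - centre) / width) ** 2) for i in range(n)]

N = 64
template_true = gaussian_pulse(N, centre=32, width=4.0, amp=1.0)

# Ensemble of noisy sweeps: half contain the pulse, half are pure noise
sweeps, labels = [], []
for s in range(200):
    has_pulse = s % 2 == 0
    noise = [random.gauss(0.0, 0.3) for _ in range(N)]
    sweep = ([p + e for p, e in zip(template_true, noise)]
             if has_pulse else noise)
    sweeps.append(sweep)
    labels.append(has_pulse)

# Systematic component: ensemble mean of the pulse-bearing sweeps
pulse_sweeps = [s for s, l in zip(sweeps, labels) if l]
template = [sum(col) / len(pulse_sweeps) for col in zip(*pulse_sweeps)]

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

# Classify each sweep by correlation against the estimated template
threshold = 0.5 * correlate(template, template)
predicted = [correlate(s, template) > threshold for s in sweeps]
accuracy = sum(p == l for p, l in zip(predicted, labels)) / len(labels)
```

The PPCA formulation in the paper additionally models the noise covariance and estimates the pulse mean without labels, which is what makes the approach usable on-line.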