Measurement of charged jet suppression in Pb-Pb collisions at √sNN = 2.76 TeV
- A measurement of the transverse momentum spectra of jets in Pb-Pb collisions at √sNN = 2.76 TeV is reported. Jets are reconstructed from charged particles using the anti-kT jet algorithm with jet resolution parameters R of 0.2 and 0.3 in the pseudorapidity range |η| < 0.5. The transverse momentum pT of charged particles is measured down to 0.15 GeV/c, which gives access to the low-pT fragments of the jet. Jets found in heavy-ion collisions are corrected event-by-event for the average background density and on an inclusive basis (via unfolding) for residual background fluctuations and detector effects. A strong suppression of jet production in central events with respect to peripheral events is observed. The suppression is found to be similar to that of charged hadrons, which suggests that substantial energy is radiated at angles larger than the jet resolution parameter R = 0.3 considered in the analysis. The fragmentation bias introduced by selecting jets with a high-pT leading particle, which rejects jets with a soft fragmentation pattern, has a similar effect on the jet yield for central and peripheral events. The ratio of jet spectra with R = 0.2 and R = 0.3 is found to be similar in Pb-Pb and simulated PYTHIA pp events, indicating no strong broadening of the radial jet structure of the reconstructed jets within R < 0.3.
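The anti-kT clustering referred to above is driven by two distance measures. The following is a minimal sketch of those measures only (a real analysis would use the FastJet implementation); the particle tuples and values are illustrative:

```python
import math

# Sketch of the anti-kT distance measures (Cacciari, Salam, Soyez):
#   d_ij = min(1/pt_i^2, 1/pt_j^2) * DeltaR_ij^2 / R^2
#   d_iB = 1/pt_i^2
# The iterative algorithm merges the pair with the smallest d_ij, or promotes
# particle i to a jet when d_iB is the smallest distance.

def anti_kt_distances(particles, R):
    """particles: list of (pt, eta, phi) tuples; R: jet resolution parameter.

    Returns the pairwise distances d_ij and the particle-beam distances d_iB.
    """
    d_ij = {}
    for i, (pti, etai, phii) in enumerate(particles):
        for j, (ptj, etaj, phij) in enumerate(particles):
            if j <= i:
                continue
            dphi = abs(phii - phij)
            dphi = min(dphi, 2 * math.pi - dphi)          # wrap azimuth
            dr2 = (etai - etaj) ** 2 + dphi ** 2          # DeltaR_ij^2
            d_ij[(i, j)] = min(pti ** -2, ptj ** -2) * dr2 / R ** 2
    d_iB = {i: pt ** -2 for i, (pt, _, _) in enumerate(particles)}
    return d_ij, d_iB
```

Because of the 1/pT² weighting, a hard particle absorbs nearby soft fragments before it is promoted to a jet, which is what makes anti-kT jets cone-like and robust against the soft heavy-ion background.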
Microwave radar imaging of heterogeneous breast tissue integrating a priori information
Thomas N. Kelly
Ian J. Craddock
- Conventional radar-based image reconstruction techniques fail when they are applied to heterogeneous breast tissue, since the underlying in-breast relative permittivity is unknown or assumed to be constant. This results in a systematic error during the process of image formation. A recent trend in microwave biomedical imaging is to extract the relative permittivity of the object under test to improve the image reconstruction quality and thereby enhance the diagnostic assessment. In this paper, we present a novel radar-based methodology for microwave breast cancer detection in heterogeneous breast tissue, integrating a 3D map of relative permittivity as a priori information. This leads to a novel image reconstruction formulation where the delay-and-sum focusing takes place in the time domain rather than the range domain. Results are shown for a heterogeneous dense (class-4) and a scattered fibroglandular (class-2) numerical breast phantom using Bristol's 31-element array configuration.
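The core of time-domain delay-and-sum focusing can be sketched as follows. This is a deliberately simplified monostatic illustration, not the paper's method: the geometry, signals, and the use of a single permittivity value per path (standing in for the 3D permittivity map) are all assumptions:

```python
import math

# Minimal delay-and-sum (DAS) sketch: the delay to a focal point is computed
# from an assumed relative permittivity along the path, so focusing happens
# in the time domain. All names, geometry and signals are illustrative.

C0 = 3e8  # speed of light in vacuum, m/s

def delay(antenna, point, eps_r):
    """One-way propagation delay: distance / (c0 / sqrt(eps_r))."""
    return math.dist(antenna, point) * math.sqrt(eps_r) / C0

def das_intensity(signals, antennas, point, eps_r, dt):
    """Coherently sum round-trip-delayed samples at a focal point.

    signals[i][k] is the k-th time sample recorded by antenna i, with
    sampling step dt; monostatic operation is assumed for simplicity.
    """
    total = 0.0
    for sig, ant in zip(signals, antennas):
        tau = 2.0 * delay(ant, point, eps_r)  # round-trip delay
        k = int(round(tau / dt))
        if 0 <= k < len(sig):
            total += sig[k]
    return total ** 2
```

Scanning `das_intensity` over a grid of focal points yields the energy map; a scatterer lights up where the permittivity-derived delays align the recorded echoes coherently.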
The final stage of gravitationally collapsed thick matter layers
- In the presence of a minimal length, physical objects cannot collapse to an infinite-density, singular, matter point. In this paper, we consider the possible final stage of the gravitational collapse of “thick” matter layers. The energy momentum tensor we choose to model these shell-like objects is a proper modification of the source for “noncommutative geometry inspired,” regular black holes. By using higher moments of the Gaussian distribution to localize matter at a finite distance from the origin, we obtain new solutions of the Einstein equations which smoothly interpolate between Minkowski geometry near the center of the shell and Schwarzschild spacetime far away from the matter layer. The metric is free of curvature singularities. Black-hole-type solutions exist only for “heavy” shells; that is, M ≥ Mρ, where Mρ is the mass of the extremal configuration. We determine the Hawking temperature and a modified area law taking into account the extended nature of the source.
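For orientation, the standard “noncommutative geometry inspired” construction that the paper modifies replaces the point mass by a Gaussian-smeared density; the resulting metric is regular at the origin. This is the well-known baseline form, quoted here as an assumption for context, not the paper's thick-shell solution:

```latex
% Baseline regular metric from a Gaussian-smeared source (assumed standard
% form; the paper modifies the source to localize matter off the origin)
ds^2 = -f(r)\,dt^2 + f(r)^{-1}\,dr^2 + r^2\,d\Omega^2, \qquad
f(r) = 1 - \frac{2\,m(r)}{r}, \qquad
m(r) = \frac{M}{\Gamma(3/2)}\,\gamma\!\left(\tfrac{3}{2},\,\frac{r^2}{4\theta}\right)
```

Here γ is the lower incomplete gamma function and √θ the minimal length: m(r) → M at large r recovers Schwarzschild, while m(r) ∝ r³ near the origin gives a de Sitter-like, curvature-regular core.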
On unitary evolution and collapse in quantum mechanics
- In the framework of an interference setup in which only two outcomes are possible (such as in the case of a Mach–Zehnder interferometer), we discuss in a simple and pedagogical way the difference between a standard, unitary quantum mechanical evolution and the existence of a real collapse of the wavefunction. This is a central and not yet resolved question of quantum mechanics, and indeed of quantum field theory as well. Moreover, we also present the Elitzur–Vaidman bomb test, the delayed-choice experiment, and the effect of decoherence. Finally, we propose two simple experiments to visualize decoherence and to test the role of an entangled particle.
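The two-outcome contrast at the heart of this setup can be made numerically explicit. In the sketch below (illustrative conventions, not taken from the paper), a balanced beam splitter acts on the two path amplitudes as a 2×2 unitary; applying it twice coherently sends the photon to one detector with certainty, while inserting a "collapse" (a which-path measurement) after the first beam splitter washes the interference out to 50/50:

```python
# Mach-Zehnder sketch: unitary evolution vs. collapse. The beam-splitter
# phase convention (a, b) -> ((a + i*b)/sqrt(2), (i*a + b)/sqrt(2)) is an
# illustrative assumption; any balanced-unitary convention gives the same
# qualitative result.

def beam_splitter(state):
    a, b = state
    s = 2 ** -0.5
    return (s * (a + 1j * b), s * (1j * a + b))

def detection_probs(state):
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

# Unitary case: both beam splitters applied coherently to amplitudes.
unitary = beam_splitter(beam_splitter((1.0, 0.0)))

# "Collapse" case: a which-path measurement after the first beam splitter
# turns amplitudes into classical probabilities, which are then propagated
# through the second beam splitter separately and mixed.
after_first = detection_probs(beam_splitter((1.0, 0.0)))  # (1/2, 1/2)
collapse = tuple(
    after_first[0] * p0 + after_first[1] * p1
    for p0, p1 in zip(detection_probs(beam_splitter((1.0, 0.0))),
                      detection_probs(beam_splitter((0.0, 1.0))))
)
```

With this convention `detection_probs(unitary)` is (0, 1), i.e. deterministic detection, while `collapse` is (0.5, 0.5): the experimentally accessible signature that distinguishes the two evolutions.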
Reliability and current-adaptability studies of a 352 MHz, 17 MeV, continuous-wave injector for an accelerator-driven system
- EUROTRANS is a European research program for the transmutation of high-level nuclear waste in an accelerator-driven system (ADS). As proposed, the driver linac needs to deliver a 2.5–4 mA, 600 MeV continuous-wave (CW) proton beam and later a 20 mA, 800 MeV one to the spallation target in the prototype-scale and industrial-scale demonstration phases, respectively. This paper focuses on the conceptual studies performed for the 17 MeV injector. First, the special beam dynamics strategies and methods, which have been developed and applied to design an injector with variable current up to 30 mA so as to allow an easy upgrade without additional R&D costs, are introduced. Then the error studies performed to evaluate the tolerance limits of the designed injector are presented.
Focus on quantum efficiency
Gregory D. Scholes
Ulrich T. Schwarz
- Technologies which convert light into energy, and vice versa, rely on complex, microscopic transport processes in the condensed phase, which obey the laws of quantum mechanics but have hitherto lacked systematic analysis and modeling. Given our much-improved understanding of multicomponent, disordered, highly structured, open quantum systems, this ‘focus on’ collection gathers cutting-edge research on theoretical and experimental aspects of quantum transport in truly complex systems as defined, e.g., by the macromolecular functional complexes at the heart of photosynthesis, by organic quantum wires, or even by photovoltaic devices. To what extent microscopic quantum coherence effects can (be made to) impact macroscopic transport behavior is an equally challenging and controversial question, and this ‘focus on’ collection provides a snapshot of the present state of affairs, as well as of the ‘quantum opportunities’ on the horizon.
In silico analysis of cell cycle synchronisation effects in radiotherapy of tumour spheroids
- Tumour cells show a varying susceptibility to radiation damage as a function of the current cell cycle phase. While this sensitivity is averaged out in an unperturbed tumour due to unsynchronised cell cycle progression, external stimuli such as radiation or drug doses can induce a resynchronisation of the cell cycle and consequently a collective development of radiosensitivity in tumours. Although this effect has been regularly described in experiments, it is currently not exploited in clinical practice, and thus a large potential for optimisation is missed. We present an agent-based model for three-dimensional tumour spheroid growth which has been combined with an irradiation damage and kinetics model. We predict the dynamic response of the overall tumour radiosensitivity to delivered radiation doses and describe the corresponding time windows of increased or decreased radiation sensitivity. The degree of cell cycle resynchronisation in response to radiation delivery was identified as a main determinant of the transient periods of low and high radiosensitivity enhancement. A range of selected clinical fractionation schemes is examined, and new triggered schedules are tested which aim to maximise the effect of the radiation-induced sensitivity enhancement. We find that cell cycle resynchronisation can yield a strong increase in therapy effectiveness if employed correctly. While the individual timing of sensitive periods will depend on the exact cell and radiation types, the enhancement is a universal effect which is present in every tumour and accordingly should be the target of experimental investigation. Experimental observables which can be assessed non-invasively and with high spatio-temporal resolution have to be connected to the radiosensitivity enhancement in order to allow for a possible tumour-specific design of highly efficient treatment schedules based on induced cell cycle synchronisation.
Author Summary: The sensitivity of a cell to a dose of radiation is largely determined by its current position within the cell cycle. While under normal circumstances progression through the cell cycle is asynchronous in a tumour mass, external influences such as chemo- or radiotherapy can induce a synchronisation. Such a common progression of the inner clock of the cancer cells makes the effectiveness of any drug or radiation dose critically dependent on a suitable timing for its administration. We analyse the exact evolution of the radiosensitivity of a sample tumour spheroid in a computer model, which enables us to predict time windows of decreased or increased radiosensitivity. Fractionated radiotherapy schedules can be tailored to avoid periods of high resistance and to exploit the induced radiosensitivity for an increase in therapy efficiency. We show that cell cycle effects can drastically alter the outcome of fractionated irradiation schedules in a spheroid cell system. By using the correct observables and continuous monitoring, cell cycle sensitivity effects have the potential to be integrated into the treatment planning of the future and thus to be employed for a better outcome in clinical cancer therapies.
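The basic mechanism, phase-dependent radiosensitivity averaged over the cycle distribution, can be illustrated with the standard linear-quadratic (LQ) survival model. This is a toy sketch, not the paper's agent-based model: the phase fractions and (α, β) values are invented for illustration, with G2/M taken as the most radiosensitive phase:

```python
import math

# Toy illustration of phase-dependent radiosensitivity via the LQ model:
#   S(D) = exp(-alpha*D - beta*D^2)
# All parameter values below are illustrative assumptions.

LQ_PARAMS = {           # phase: (alpha [1/Gy], beta [1/Gy^2]) -- assumed
    "G1":   (0.20, 0.02),
    "S":    (0.10, 0.01),
    "G2/M": (0.40, 0.04),
}

def survival(dose, phase_fractions):
    """Population survival: phase-fraction-weighted mean of LQ survival."""
    total = 0.0
    for phase, frac in phase_fractions.items():
        a, b = LQ_PARAMS[phase]
        total += frac * math.exp(-(a * dose + b * dose ** 2))
    return total

# A 2 Gy fraction hitting an unsynchronised population vs. one transiently
# resynchronised into the radiosensitive G2/M phase:
desync = survival(2.0, {"G1": 0.5, "S": 0.3, "G2/M": 0.2})
sync = survival(2.0, {"G1": 0.1, "S": 0.1, "G2/M": 0.8})
```

The same dose kills more cells in the synchronised case (`sync < desync`), which is the window-of-opportunity effect the triggered fractionation schedules aim to exploit.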
From p+p to Pb+Pb collisions: wounded nucleon versus statistical models
- The system size dependence of hadron production properties is discussed within the Wounded Nucleon Model and the Statistical Model in its grand canonical, canonical and micro-canonical formulations. Similarities and differences between the predictions of the models, related to their treatment of conservation laws, are exposed. The need for models which would combine a hydrodynamic-like expansion with conservation laws obeyed in individual collisions is stressed.
Generating functionals for autonomous latching dynamics in attractor relict networks
- Coupling local, slowly adapting variables to an attractor network makes it possible to destabilize all attractors, turning them into attractor ruins. The resulting attractor relict network may show ongoing autonomous latching dynamics. We propose to use two generating functionals for the construction of attractor relict networks: a Hopfield energy functional, which generates a neural attractor network, and a functional based on information-theoretical principles, encoding the information content of the neural firing statistics, which induces latching transitions from one transiently stable attractor ruin to the next. We investigate the influence of stress, in terms of conflicting optimization targets, on the resulting dynamics. Objective function stress is absent when the target level for the mean of the neural activities is identical for the two generating functionals, and the resulting latching dynamics is then found to be regular. Objective function stress is present when the respective target activity levels differ, inducing intermittent bursting latching dynamics.
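The first of the two generating functionals, the Hopfield energy, can be sketched concretely. The Hebbian couplings and the tiny pattern sizes below are illustrative assumptions; the paper's full construction additionally couples this functional to the information-theoretic one:

```python
# Sketch of a Hopfield energy functional with Hebbian couplings. Stored
# patterns are +/-1 vectors; the stored patterns become the attractors
# (the states that the slow variables later destabilize into "ruins").

def hebbian_weights(patterns):
    """Hebbian couplings w_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for xi in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += xi[i] * xi[j] / n
    return w

def energy(w, x):
    """Hopfield energy functional E(x) = -(1/2) * sum_ij w_ij x_i x_j."""
    n = len(x)
    return -0.5 * sum(w[i][j] * x[i] * x[j]
                      for i in range(n) for j in range(n))
```

Stored patterns sit at local minima of E, which is what makes them attractors; the slow adaptive variables of the paper then raise the energy of the currently occupied minimum, forcing a latching transition to the next ruin.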
Neuropsychological constraints to human data production on a global scale
- Which factors underlie human information production on a global level? In order to gain insight into this question we study a corpus of 252–633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284–675 terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining quality.
Author summary: The generation of new information is limited by two key factors: the incurred economic costs and the capacity of the human brain to process and store data and information; the controlling agent needs to retain an overall understanding even when data is generated by semiautomatic processes. These constraints are reflected in the statistical properties of the data files publicly available on the Internet. Collecting a corpus of 252–633 million files, we find that the statistics of the file size distribution are consistent with the supposition that data production on a global level is shaped and limited by the neuropsychological information-processing capacity of the brain, with economic and hardware constraints having a negligible influence.
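The distinction between the two file-size laws can be illustrated on synthetic data (not the studied corpus): a log-normal arises from multiplicative growth, so the log-sizes are Gaussian, whereas for a power law the log-sizes are exponentially distributed and therefore strongly right-skewed. The sampling schemes and the skewness diagnostic below are illustrative choices:

```python
import math
import random

# Synthetic illustration of the two candidate file-size distributions.
random.seed(0)

def lognormal_sample(mu, sigma, n):
    """Log-normal sizes: exp of a Gaussian (multiplicative growth)."""
    return [math.exp(random.gauss(mu, sigma)) for _ in range(n)]

def pareto_sample(alpha, s_min, n):
    """Power-law (Pareto) sizes via inverse-transform sampling:
    S = s_min * (1 - U)^(-1/alpha), so P(S > s) = (s / s_min)^(-alpha)."""
    return [s_min * (1.0 - random.random()) ** (-1.0 / alpha)
            for _ in range(n)]

def log_skew(samples):
    """Skewness of the log-sizes: ~0 for a log-normal (Gaussian logs),
    ~2 for a power law (exponential logs)."""
    logs = [math.log(s) for s in samples]
    n = len(logs)
    m = sum(logs) / n
    var = sum((x - m) ** 2 for x in logs) / n
    return sum((x - m) ** 3 for x in logs) / (n * var ** 1.5)
```

A simple diagnostic of this kind separates the two families cleanly even before any distribution fitting: time-free data such as images land in the heavy-tailed, high-log-skew regime, multimedia files in the symmetric-log regime.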