Keyword: optics
Paper Title Other Keywords Page
SUPAG01 Space Charge and Transverse Instabilities at the CERN SPS and LHC coupling, simulation, space-charge, impedance 80
 
  • E. Métral, D. Amorim, G. Arduini, H. Bartosik, E. Benedetto, H. Burkhardt, K.S.B. Li, A. Oeftiger, D. Quatraro, G. Rumolo, B. Salvant, C. Zannini
    CERN, Geneva, Switzerland
 
  At the CERN accelerator complex, it seems that only the highest-energy machine in the chain, the LHC, whose space charge (SC) parameter is close to one, sees the predicted beneficial effect of SC on transverse coherent instabilities. In the other circular machines of the LHC injector chain (PSB, PS and SPS), where the SC parameter is much larger than one, SC does not seem to play a major (stabilising) role, and in the SPS the effect may even be the opposite. All the measurements and simulations performed so far in both the SPS and LHC are reviewed and analysed in detail.
Slides SUPAG01 [37.523 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-SUPAG01  
About • paper received ※ 20 October 2018       paper accepted ※ 19 November 2018       issue date ※ 26 January 2019  
Export • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
SUPLG01 Computational Accelerator Physics: On the Road to Exascale simulation, space-charge, plasma, radiation 113
 
  • R.D. Ryne
    LBNL, Berkeley, USA
 
  The first conference in what would become the ICAP series was held in 1988. At that time the most powerful computer in the world was a Cray Y-MP with 8 processors and a peak performance of 2 gigaflops. Today the fastest computer in the world has more than 2 million cores and a theoretical peak performance of nearly 200 petaflops. Compared to 1988, performance has increased by a factor of 100 million, accompanied by huge advances in memory, networking, big-data management and analytics. By the time of the next ICAP in 2021 we will be at the dawn of the Exascale era. In this talk I describe the advances in Computational Accelerator Physics that brought us to this point and what to expect from High Performance Computing in the future. This writeup is based on my presentation at ICAP'18, along with some additional comments that I did not include originally due to time constraints.  
Slides SUPLG01 [25.438 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-SUPLG01  
About • paper received ※ 14 November 2018       paper accepted ※ 07 December 2018       issue date ※ 26 January 2019  
 
MOPAF05 Approaches to Optimizing Spin Transmission in Lattice Design resonance, lattice, polarization, emittance 151
 
  • V.H. Ranjbar
    BNL, Upton, Long Island, New York, USA
 
  Funding: Work supported by Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy.
We present our experiences in optimizing the proposed Rapid Cycling Synchrotron (RCS) injector for the eRHIC storage ring and the RHIC 2017 lattice. We have developed Python code to drive lattice calculations in MAD-X, which are then used to calculate spin resonances with the DEPOL algorithm. This approach has been used to minimize intrinsic spin resonances during the RCS acceleration cycle while controlling lattice parameters such as dispersion and beta functions. It has also been used to construct localized imperfection bumps using a spin response matrix and SVD, and to reduce interfering intrinsic spin resonances during the RHIC acceleration ramp.
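The SVD-based bump construction mentioned above can be sketched in a few lines of NumPy. The response matrix and driving-term vector below are random placeholders, not actual RCS or RHIC data; in practice the matrix would come from DEPOL calculations on the MAD-X lattice.

```python
import numpy as np

# Placeholder spin response matrix R: 4 resonance driving terms (rows)
# as a function of 6 corrector-bump amplitudes (columns).
rng = np.random.default_rng(0)
R = rng.normal(size=(4, 6))
d = rng.normal(size=4)          # driving terms to be cancelled

# Solve R @ c = -d via the SVD: minimum-norm corrector settings that
# cancel the driving terms (pseudo-inverse applied to -d).
U, s, Vt = np.linalg.svd(R, full_matrices=False)
c = Vt.T @ ((U.T @ -d) / s)
```

With more correctors than constraints, as here, the driving terms cancel exactly; in the overdetermined case the same formula returns the least-squares optimum.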
 
Slides MOPAF05 [1.333 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-MOPAF05  
About • paper received ※ 17 October 2018       paper accepted ※ 24 October 2018       issue date ※ 26 January 2019  
 
TUPAF02 SixTrack Project: Status, Runtime Environment, and New Developments simulation, scattering, HOM, collimation 172
 
  • R. De Maria, J. Andersson, L. Field, M. Giovannozzi, P.D. Hermes, N. Hoimyr, G. Iadarola, S. Kostoglou, E.H. Maclean, E. McIntosh, A. Mereghetti, J. Molson, V.K.B. Olsen, D. Pellegrini, T. Persson, M. Schwinzerl, K.N. Sjobak
    CERN, Geneva, Switzerland
  • E.H. Maclean
    University of Malta, Information and Communication Technology, Msida, Malta
  • S. Singh
    Indian Institute of Technology Madras, Chennai, India
  • K.N. Sjobak
    University of Oslo, Oslo, Norway
  • I. Zacharov
    EPFL, Lausanne, Switzerland
 
  Funding: Research supported by the HL-LHC project and Google Summer of Code 2018.
SixTrack is a single-particle tracking code for high-energy circular accelerators, routinely used at CERN for simulations of the Large Hadron Collider (LHC), its luminosity upgrade (HL-LHC), the Future Circular Collider (FCC), and the Super Proton Synchrotron (SPS). The code is based on a 6D symplectic tracking engine, which is optimised for long-term tracking simulations and delivers fully reproducible results on several platforms. It also includes multiple scattering engines for beam-matter interaction studies, as well as facilities to run integrated simulations with FLUKA and GEANT4. These features differentiate SixTrack from general-purpose optics-design software like MAD-X. The code recently underwent a major restructuring to merge advanced features, such as multiple ion species, interfaces with external codes, and high-performance input/output (XRootD, HDF5), into a single branch. This restructuring also removed a large number of build flags, the corresponding functionality now being enabled or disabled at run time. In the process, the code was moved from the Fortran 77 to the Fortran 2018 standard, which also allowed better modularization. Physics models (beam-beam effects, RF multipoles, current-carrying wires, solenoids, and electron lenses) and methods (symplecticity check) have also been reviewed and refined to offer more accurate results. The SixDesk runtime environment allows the user to manage the large batches of simulations required for accurate predictions of the dynamic aperture. SixDesk supports the CERN LSF and HTCondor batch systems, as well as the BOINC infrastructure in the framework of the LHC@Home volunteer computing project. SixTrackLib is a new library aimed at providing a portable and flexible tracking engine for single- and multi-particle problems using the models and formalism of SixTrack. The tracking routines are implemented in parametrized C code specialised to run vectorized on CPUs and GPUs, using SIMD intrinsics, OpenCL 1.2, and CUDA technologies.
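The symplecticity check mentioned in the abstract can be illustrated with a minimal 2D stand-in (SixTrack's actual check is over the full 6D map): a transfer matrix M is symplectic if it preserves the symplectic form S.

```python
import numpy as np

# 2D toy transfer matrix: thin quadrupole kick followed by a drift.
L, k = 1.0, 0.3
M = np.array([[1.0, L], [0.0, 1.0]]) @ np.array([[1.0, 0.0], [-k, 1.0]])

# Symplecticity check: M.T @ S @ M must equal S, the unit symplectic
# form; for 2x2 matrices this is equivalent to det(M) == 1.
S = np.array([[0.0, 1.0], [-1.0, 0.0]])
assert np.allclose(M.T @ S @ M, S)   # map preserves phase-space area
```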
 
Slides TUPAF02 [0.938 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-TUPAF02  
About • paper received ※ 18 October 2018       paper accepted ※ 24 October 2018       issue date ※ 26 January 2019  
 
TUPAF08 A Full Field-Map Modeling of Cornell-BNL CBETA 4-Pass Energy Recovery Linac FFAG, linac, dipole, simulation 186
 
  • F. Méot, S.J. Brooks, D. Trbojevic, N. Tsoupas
    BNL, Upton, Long Island, New York, USA
  • J.A. Crittenden
    Cornell University (CLASSE), Cornell Laboratory for Accelerator-Based Sciences and Education, Ithaca, New York, USA
 
  Funding: Work supported by Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy
The Cornell-BNL Electron Test Accelerator (CBETA) is a four-pass, 150 MeV energy recovery linac (ERL), now under construction at Cornell. A single fixed-field alternating gradient (FFAG) beam line recirculates the four energies: 42, 78, 114 and 150 MeV. The return loop comprises 107 quadrupole-doublet cells, built using Halbach permanent-magnet technology. Spreader and combiner sections (four independent beam lines each) connect the 36 MeV linac to the FFAG loop. We present here a start-to-end simulation of the 4-pass ERL based entirely, and exclusively, on magnetic field maps to model the magnets and correctors. The paramount reasons for this choice are discussed, and detailed outcomes are presented, together with comparisons with regular (map-based) beam transport techniques.
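A minimal sketch of what field-map tracking involves: interpolate the field at the particle's position and step the trajectory through it. The uniform toy map and crude chord-stepping integrator below are illustrative only; the paper uses measured CBETA magnet maps and far more careful integration.

```python
import numpy as np

# Toy "field map": vertical field B_y sampled on an (x, z) grid of the
# bend plane; uniform here, so the expected orbit is a circle.
xs = np.linspace(-1.0, 1.0, 41)
zs = np.linspace(-1.0, 1.0, 41)
By = np.full((xs.size, zs.size), 0.5)   # tesla

def field(x, z):
    """Bilinear interpolation of the map at (x, z)."""
    i = int(np.clip(np.searchsorted(xs, x) - 1, 0, xs.size - 2))
    j = int(np.clip(np.searchsorted(zs, z) - 1, 0, zs.size - 2))
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    tz = (z - zs[j]) / (zs[j + 1] - zs[j])
    return ((1 - tx) * (1 - tz) * By[i, j] + tx * (1 - tz) * By[i + 1, j]
            + (1 - tx) * tz * By[i, j + 1] + tx * tz * By[i + 1, j + 1])

# Step a particle of magnetic rigidity B*rho = 0.05 T.m through the map:
# rotate the direction by the local bend angle, then advance along it.
# Expected orbit radius: rho = brho / B = 0.1 m.
brho = 0.05
r, v, ds = np.array([0.0, 0.0]), np.array([0.0, 1.0]), 1e-3
for _ in range(1000):
    theta = field(*r) / brho * ds           # bend angle over one step
    cs, sn = np.cos(theta), np.sin(theta)
    v = np.array([cs * v[0] - sn * v[1], sn * v[0] + cs * v[1]])
    r = r + v * ds
```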
 
Slides TUPAF08 [2.568 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-TUPAF08  
About • paper received ※ 23 October 2018       paper accepted ※ 07 December 2018       issue date ※ 26 January 2019  
 
TUPAF13 Calculation of the AGS Optics Based on 3D Fields Derived From Experimentally Measured Fields on Median Plane extraction, closed-orbit, kicker, focusing 209
 
  • N. Tsoupas, J.S. Berg, S.J. Brooks, F. Méot, V. Ptitsyn, D. Trbojevic
    BNL, Upton, Long Island, New York, USA
 
  Funding: Work supported by the US Department of Energy
Closed-orbit calculations of the AGS synchrotron were performed, and the beam parameters at the extraction point of the AGS [1] were calculated, using the RAYTRACE computer code [2], modified to generate 3D fields from the experimentally measured field maps on the median plane of the AGS combined-function magnets. The algorithm that generates 3D fields from field maps on a plane is described in reference [3], which discusses the mathematical foundation of this approach in detail. In this presentation we discuss results from studies [1,4] based on the 3D fields generated from the known field components on a rectangular grid in a plane. A brief overview of the algorithm is given, and two methods of calculating the required field derivatives on the plane are presented. The 3D fields of a modified Halbach magnet [5] with an inner radius of 4.4 cm are calculated using the two derivative methods and compared against the 'ideal' fields computed by the OPERA code [6].
[1] N. Tsoupas et al., 'Closed orbit calculations at AGS and Extraction Beam Parameters at H13', AD/RHIC/RD-75, Oct. 1994.
[2] S.B. Kowalski and H.A. Enge, 'The Ion-Optical Program Raytrace', NIM A258 (1987) 407.
[3] K. Makino, M. Berz, C. Johnstone, Int. J. Mod. Phys. A 26 (2011) 1807-1821.
[4] N. Tsoupas et al., 'Effects of Dipole Magnet Inhomogeneity on the Beam Ellipsoid', NIM A258 (1987) 421-425.
[5] 'The CBETA project', arXiv:1706.04245.
[6] Vector Fields Inc., https://operafea.com/
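With midplane symmetry, Maxwell's equations fix the off-plane field from the midplane component and its derivatives: to leading order Bx = y dBy/dx, Bz = y dBy/dz, and By(y) = By(0) - (y^2/2)(d2By/dx2 + d2By/dz2). A sketch, using a made-up polynomial in place of the measured AGS map, with centred finite differences standing in for one of the two derivative-evaluation methods the paper compares:

```python
import numpy as np

# Median-plane map of the only nonzero midplane component B_y(x, 0, z);
# a made-up smooth function stands in for the measured data.
xs = np.linspace(-1.0, 1.0, 201)
zs = np.linspace(-1.0, 1.0, 201)
X, Z = np.meshgrid(xs, zs, indexing="ij")
By0 = 1.0 + 0.2 * X**2 - 0.2 * Z**2 + 0.1 * X * Z

# Field derivatives on the plane from centred finite differences.
dBdx = np.gradient(By0, xs, axis=0)
dBdz = np.gradient(By0, zs, axis=1)
d2Bdx2 = np.gradient(dBdx, xs, axis=0)
d2Bdz2 = np.gradient(dBdz, zs, axis=1)

def field_off_plane(i, j, y):
    """Leading-order Maxwell expansion about the median plane at grid
    point (i, j), assuming midplane symmetry (Bx = Bz = 0 at y = 0)."""
    Bx = y * dBdx[i, j]
    Bz = y * dBdz[i, j]
    By = By0[i, j] - 0.5 * y**2 * (d2Bdx2[i, j] + d2Bdz2[i, j])
    return Bx, By, Bz
```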
 
Slides TUPAF13 [1.772 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-TUPAF13  
About • paper received ※ 20 October 2018       paper accepted ※ 07 December 2018       issue date ※ 26 January 2019  
 
TUPAG10 Nonlinear Optics at UMER: Lessons Learned in Simulation octupole, lattice, simulation, resonance 278
 
  • K.J. Ruisard, B.L. Beaudoin, I. Haber, T.W. Koeth, D.B. Matthew
    UMD, College Park, Maryland, USA
 
  Funding: Funding through DOE-HEP Award DE-SC0010301, NSF Award PHY-1414681, and the NSF GRFP program. Manuscript authored by UT-Battelle, LLC, under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy.
Invited talk: Design of accelerator lattices with nonlinear optics to suppress transverse resonances is a novel approach that may be crucial for enabling low-loss, high-intensity beam transport. Large amplitude-dependent tune spreads, driven by nonlinear field inserts, damp the resonant response to driving terms. This presentation focuses on simulations of the UMER lattice operated as a quasi-integrable system (one invariant of transverse motion) with a single strong octupole insert. We discuss the evolution of the simulation models, including the observation of losses associated with the original operating point near a fourth-order resonance. Other operating points farther from this resonance are considered and shown to be more promising.
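The single-octupole scheme can be caricatured as a linear rotation (the rest of the lattice) followed by a thin octupole kick; the tune and strength below are illustrative numbers, not the UMER operating point. The turn-by-turn data give the tune via an FFT, the quantity whose amplitude dependence provides the resonance-damping tune spread.

```python
import numpy as np

mu = 2 * np.pi * 0.31      # linear phase advance, away from 4th order
k3 = 50.0                  # integrated octupole strength (arbitrary)
c, s = np.cos(mu), np.sin(mu)

def one_turn(x, xp):
    x, xp = c * x + s * xp, -s * x + c * xp   # linear rotation
    xp -= k3 * x**3 / 6.0                      # thin octupole kick
    return x, xp

# Track a small-amplitude particle and measure its tune from the FFT
# peak of the turn-by-turn horizontal position.
x, xp, turns = 0.01, 0.0, []
for _ in range(1024):
    x, xp = one_turn(x, xp)
    turns.append(x)
tune = (np.argmax(np.abs(np.fft.rfft(turns))[1:]) + 1) / len(turns)
```

Repeating the tune measurement at larger starting amplitudes exposes the octupole's amplitude-dependent detuning.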
 
Slides TUPAG10 [3.447 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-TUPAG10  
About • paper received ※ 19 October 2018       paper accepted ※ 28 January 2019       issue date ※ 26 January 2019  
 
TUPAG17 Beamline Map Computation for Paraxial Optics FEL, radiation, synchrotron, electron 297
 
  • B. Nash, J.P. Edelen, N.B. Goldring, S.D. Webb
    RadiaSoft LLC, Boulder, Colorado, USA
 
  Funding: Department of Energy office of Basic energy sciences, DE-SC0018571
Modeling of radiation transport is an important topic, tightly coupled to many charged-particle dynamics simulations for synchrotron light sources and FEL facilities. The radiation is determined by the electron beam and the magnetic field source, and then passes through beamlines with focusing elements, apertures and monochromators, for which one may typically apply the paraxial approximation of small angular deviations from the optical axis. The radiation is then used in a wide range of spectroscopic experiments, or may be recirculated back to the electron beam source in the case of an FEL oscillator. The Wigner-function representation of electromagnetic wavefronts has been described in the literature and allows a phase-space description of the radiation, similar to that used in charged-particle dynamics. It can encompass both fully and partially coherent cases, as well as polarization. Here, we describe the calculation of a beamline map that can be applied to the radiation Wigner function, reducing the computation time. We discuss the use of ray-tracing and wave-optics codes for the map computation and benchmarking. We construct a four-crystal 1:1 imaging beamline that could be used for recirculation in an XFEL oscillator, and benchmark the map-based results against SRW wavefront simulations.
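In the Gaussian (fully coherent) limit the idea reduces to a familiar calculation: a paraxial beamline acts on phase space through its ABCD matrix, so the Wigner function transforms as W'(z) = W(M^-1 z) and its second-moment matrix as Sigma' = M Sigma M^T. The sketch below builds a 1:1 imaging (4f) telescope from thin lenses, stand-ins for the paper's crystal optics:

```python
import numpy as np

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# 1:1 imaging (4f) telescope: drift f, lens, drift 2f, lens, drift f.
# Its ABCD matrix is -I, so it returns any input distribution intact.
f = 0.5
M = drift(f) @ lens(f) @ drift(2 * f) @ lens(f) @ drift(f)

# Gaussian wavefront: second moments in (x, theta) transform as
# Sigma' = M Sigma M^T, the map applied to the Wigner function.
sigma = np.diag([1e-8, 1e-10])
sigma_out = M @ sigma @ M.T
```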
 
Slides TUPAG17 [2.289 MB]  
DOI • reference for this paper ※ https://doi.org/10.18429/JACoW-ICAP2018-TUPAG17  
About • paper received ※ 19 October 2018       paper accepted ※ 18 December 2018       issue date ※ 26 January 2019  