• Iron Lynx
      2 months ago

      ASCII was originally a 7-bit standard. If you type ASCII text on an 8-bit system, the leading bit of every byte is always 0.

      (Edited to specify context)

      At least ASCII is forward compatible with UTF-8
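
      A quick Python sketch (mine, not the commenter's) showing both points: every ASCII byte has its leading bit clear, and the same bytes decode unchanged as UTF-8:

          text = "plain ASCII"
          data = text.encode("ascii")

          # In an 8-bit byte, the leading (most significant) bit of
          # every ASCII character is 0, i.e. every byte is below 0x80.
          assert all(byte & 0x80 == 0 for byte in data)

          # The same bytes are valid UTF-8 and decode to the same
          # string, which is the forward compatibility in question.
          assert data.decode("utf-8") == text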

    • @houseofleft@slrpnk.net
      2 months ago

      ASCII needs seven bits, but is almost always encoded as bytes, so every ASCII letter has a throwaway bit.
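
      A minimal Python illustration of that spare bit (my sketch, not part of the comment):

          # Seven bits cover code points 0-127, so when each ASCII
          # letter is stored in an 8-bit byte, the top bit goes unused.
          for byte in "hi!".encode("ascii"):
              print(format(byte, "08b"))  # the leading bit is always 0
          # 01101000
          # 01101001
          # 00100001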

        • @anton@lemmy.blahaj.zone
          2 months ago

          That boolean can indicate whether it's a fancy character: that way all ASCII characters are themselves, but if the boolean is set it's something else. We could take the other symbol from a page of codes chosen to fit the user's language.
          Or we could let true mean that the character takes more than one byte, allowing us to encode all of Unicode as a format consisting of 8-bit parts.
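
          A toy Python sketch of the first idea (the code-page contents and the CODE_PAGE name here are hypothetical, not from the comment):

              # Hypothetical code page: bytes 0x80-0xFF stand for
              # language-specific symbols instead of ASCII.
              CODE_PAGE = {0x80 + i: ch for i, ch in enumerate("äöüßéèêç")}

              def decode_byte(byte):
                  if byte & 0x80 == 0:
                      return chr(byte)             # boolean clear: ASCII maps to itself
                  return CODE_PAGE.get(byte, "?")  # boolean set: look it up in the page

              print("".join(decode_byte(b) for b in bytes([0x48, 0x69, 0x80])))  # Hiä

          The second idea is essentially what UTF-8 does, as the reply below shows.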

      • @FuckBigTech347@lemmygrad.ml
        2 months ago

        Some old software does use 8-bit ASCII for special/locale-specific characters. Also, there is the Unicode hack (UTF-8) where the high bit is used to determine whether a byte is part of a multi-byte sequence.
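
        A short Python sketch (my example, not the commenter's) of how UTF-8 uses the high bits of each byte:

            def classify(byte):
                # High bit 0: a plain ASCII byte standing alone.
                if byte & 0x80 == 0:
                    return "single byte (ASCII)"
                # Top two bits 10: continuation of a multi-byte sequence.
                if byte & 0xC0 == 0x80:
                    return "continuation byte"
                # Top bits 11: the start of a multi-byte sequence.
                return "leading byte of a multi-byte sequence"

            for byte in "aé€".encode("utf-8"):
                print(f"{byte:08b}  {classify(byte)}")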