What's the simplest way to convert from a single character String to an ASCII value in Swift?

I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:

var singleChar = "a"
println(singleChar.unicodeScalars[singleChar.unicodeScalars.startIndex].value) //prints: 97

This is so ugly though. There must be a simpler way.


You can use NSString's characterAtIndex to accomplish this...

var singleCharString = "a" as NSString
var singleCharValue = singleCharString.characterAtIndex(0)
println("The value of \(singleCharString) is \(singleCharValue)")  // The value of a is 97
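A caveat worth knowing with this approach: `character(at:)` returns a UTF-16 code unit (`unichar`, i.e. `UInt16`), not an ASCII value. The two only coincide for ASCII input. A small sketch (the `"é"` example is mine, not from the answer above):

```swift
import Foundation

// character(at:) returns a UTF-16 code unit (unichar / UInt16).
// For ASCII characters this matches the ASCII value; otherwise it doesn't.
let a = ("a" as NSString).character(at: 0)   // 97 — matches ASCII
let e = ("é" as NSString).character(at: 0)   // 233 — the UTF-16 value of U+00E9, not ASCII
```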

edit/update Swift 5.2 or later

extension StringProtocol {
    var asciiValues: [UInt8] { compactMap(\.asciiValue) }
}

"abc".asciiValues  // [97, 98, 99]

In Swift 5 you can use the new character properties isASCII and asciiValue

Character("a").isASCII       // true
Character("a").asciiValue    // 97


Character("á").isASCII       // false
Character("á").asciiValue    // nil
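Since `asciiValue` yields an optional `UInt8`, the round trip back to `Character` is simple — a minimal sketch (not part of the answer above) using the non-failable `UnicodeScalar(_: UInt8)` initializer:

```swift
// Round trip: Character -> ASCII value -> Character.
// UnicodeScalar(_: UInt8) is non-failable, so no force-unwrap is needed on the way back.
if let value = Character("a").asciiValue {
    let restored = Character(UnicodeScalar(value))
    print(restored)  // a
}
```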

Old answer

You can create an extension:

Swift 4.2 or later

extension Character {
    var isAscii: Bool {
        return unicodeScalars.allSatisfy { $0.isASCII }
    }
    var ascii: UInt32? {
        return isAscii ? unicodeScalars.first?.value : nil
    }
}

extension StringProtocol {
    var asciiValues: [UInt32] {
        return compactMap { $0.ascii }
    }
}

Character("a").isAscii  // true
Character("a").ascii    // 97


Character("á").isAscii  // false
Character("á").ascii    // nil


"abc".asciiValues            // [97, 98, 99]
"abc".asciiValues[0]         // 97
"abc".asciiValues[1]         // 98
"abc".asciiValues[2]         // 99

The way you're doing it is right. If you don't like the verbosity of the indexing, you can avoid it by cycling through the unicode scalars:

var x: UInt32 = 0
let char = "a"
for sc in char.unicodeScalars { x = sc.value; break }

You can actually omit the break in this case, of course, since there is only one unicode scalar.

Or, convert to an Array and use Int indexing (the last resort of the desperate):

let char = "a"
let x = Array(char.unicodeScalars)[0].value

A slightly shorter way of doing this could be:

singleChar.unicodeScalars.first!.value

As with the subscript version, this will crash if your string is actually empty, so if you're not 100% sure, use the optional:

if let ascii = singleChar.unicodeScalars.first?.value {
    // use ascii here
}

Or, if you want to be extra-paranoid,

if let scalar = singleChar.unicodeScalars.first, scalar.isASCII {
    let ascii = scalar.value
}

Here's my implementation, it returns an array of the ASCII values.

extension String {

    func asciiValueOfString() -> [UInt32] {
        var retVal = [UInt32]()
        for val in self.unicodeScalars where val.isASCII() {
            retVal.append(UInt32(val))
        }
        return retVal
    }
}

Note: Yes it's Swift 2 compatible.

Now in Xcode 7.1 and Swift 2.1

var singleChar = "a"


singleChar.unicodeScalars.first?.value
UnicodeScalar("1")!.value // returns 49

Swift 3.1

var singchar = "a" as NSString


print(singchar.character(at: 0))

Swift 3.1 / Swift 4

print("c".utf8["c".utf8.startIndex])

or

let cu = "c".utf8
print(cu[cu.startIndex])

Both print 99. Works for any ASCII character.
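One caveat (my addition, not from the answer above): `.utf8` yields UTF-8 *bytes*, so a non-ASCII character produces several bytes, none of which is an ASCII value. This sketch assumes the `"é"` literal is the precomposed scalar U+00E9:

```swift
// For ASCII input, each Character is exactly one UTF-8 byte.
Array("c".utf8)   // [99]
// Non-ASCII characters span multiple UTF-8 bytes.
Array("é".utf8)   // [195, 169] — the two-byte UTF-8 encoding of U+00E9
```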

There's also UInt8's UInt8(ascii: Unicode.Scalar) initializer.

var singleChar = "a"
UInt8(ascii: singleChar.unicodeScalars[singleChar.startIndex])
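Worth noting: `UInt8(ascii:)` is precondition-checked and traps at runtime if the scalar is outside `0...127`. A guarded version (a small sketch of my own) checks `isASCII` first:

```swift
let scalar = "a".unicodeScalars.first!
// UInt8(ascii:) traps for non-ASCII scalars, so guard with isASCII.
let byte: UInt8? = scalar.isASCII ? UInt8(ascii: scalar) : nil
print(byte as Any)  // Optional(97)
```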

Swift 4.1

https://oleb.net/blog/2017/11/swift-4-strings/

let flags = "99_problems"
flags.unicodeScalars.map {
    "\(String($0.value, radix: 16, uppercase: true))"
}

Result:

["39", "39", "5F", "70", "72", "6F", "62", "6C", "65", "6D", "73"]

Swift 4.2

The easiest way to get the ASCII values of a Swift string is to iterate over its utf8 view:

let str = "Swift string"
for ascii in str.utf8 {
    print(ascii)
}

Output:

83
119
105
102
116
32
115
116
114
105
110
103

Swift 4+

Char to ASCII

let ch: Character = "a"
let charVal = String(ch).unicodeScalars
let asciiVal = charVal[charVal.startIndex].value

ASCII to Char

let char = Character(UnicodeScalar(asciiVal)!)
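Note that `UnicodeScalar(_: UInt32)` is failable — it returns nil for invalid scalar values such as UTF-16 surrogates — which is why the force-unwrap is needed above. A safer sketch (my addition) uses optional binding:

```swift
let asciiVal: UInt32 = 97
// UnicodeScalar(_: UInt32) is failable: values like 0xD800 (a surrogate) return nil.
if let scalar = UnicodeScalar(asciiVal) {
    let char = Character(scalar)
    print(char)  // a
}
```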

With Swift 5, you can pick one of the following approaches in order to get the ASCII numeric representation of a character.


#1. Using Character's asciiValue property

Character has a property called asciiValue. asciiValue has the following declaration:

var asciiValue: UInt8? { get }

The ASCII encoding value of this character, if it is an ASCII character.

The following Playground sample codes show how to use asciiValue in order to get the ASCII encoding value of a character:

let character: Character = "a"
print(character.asciiValue) //prints: Optional(97)

let string = "a"
print(string.first?.asciiValue) //prints: Optional(97)

let thumbsUp: Character = "👍"
print(thumbsUp.asciiValue) //prints: nil

#2. Using Character's isASCII property and Unicode.Scalar's value property

As an alternative, you can check that the first character of a string is an ASCII character (using Character's isASCII property) then get the numeric representation of its first Unicode scalar (using Unicode.Scalar's value property). The Playground sample code below shows how to proceed:

let character: Character = "a"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: 97
 */

let string = "a"
if let character = string.first, character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: 97
 */

let thumbsUp: Character = "👍"
if thumbsUp.isASCII, let scalar = thumbsUp.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: Not an ASCII character
 */

let input = "Swift".map { $0.asciiValue! }  // mapping over a String already yields Characters; force-unwrap crashes on non-ASCII

// [83, 119, 105, 102, 116]