To convert integer types in Swift while preserving bit patterns, use the following initializer.

```
init<T>(truncatingIfNeeded source: T) where T : BinaryInteger
```

This can be used to convert between unsigned and signed integers. For example, `0x00`, `0x7F`, `0x80`, and `0xFF` in `UInt8` would look like this in decimal.

Hex | Decimal in UInt8 |
---|---|
0x00 | 0 |
0x7F | 127 |
0x80 | 128 |
0xFF | 255 |

The same values interpreted as `Int8` would look like this.

Hex | Decimal in Int8 |
---|---|
0x00 | 0 |
0x7F | 127 |
0x80 | -128 |
0xFF | -1 |

You can confirm that you get the expected conversion by running the following in a Playground.

```
let u1 = UInt8(0x00)
let u2 = UInt8(0x7F)
let u3 = UInt8(0x80)
let u4 = UInt8(0xFF)
let i1 = Int8(truncatingIfNeeded: u1) // 0
let i2 = Int8(truncatingIfNeeded: u2) // 127
let i3 = Int8(truncatingIfNeeded: u3) // -128
let i4 = Int8(truncatingIfNeeded: u4) // -1
```

If no argument label is specified, the following initializer is used instead.

```
init<T>(_ source: T) where T : BinaryInteger
```

This initializer can also perform the conversion, but a runtime error occurs when the value is out of range. Specifically, the conversion from `UInt8` to `Int8` fails for values greater than or equal to `128`.

Run the following code in a Playground.

```
let u1 = UInt8(0x00)
let u2 = UInt8(0x7F)
let u3 = UInt8(0x80)
let u4 = UInt8(0xFF)
let i1 = Int8(u1)
let i2 = Int8(u2)
let i3 = Int8(u3) // runtime error: 0x80 does not fit in Int8
let i4 = Int8(u4)
```

Then the following is output to the console, and an error occurs when making an `Int8` from `u3`.

```
Swift/Integers.swift:3564: Fatal error: Not enough bits to represent the passed value
```

`init(truncatingIfNeeded:)` succeeds because it converts the type while preserving the bit pattern of the number. `128` in `UInt8` and `-128` in `Int8` are both `10000000` in binary.
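
You can observe the shared bit pattern yourself. The sketch below uses `String(_:radix:)` and `UInt8(bitPattern:)` from the standard library to print the binary representation of both values.

```
let u3 = UInt8(0x80)
let i3 = Int8(truncatingIfNeeded: u3)

// Reinterpret the signed value's bits as UInt8 to see the raw pattern.
print(String(u3, radix: 2))                    // 10000000
print(String(UInt8(bitPattern: i3), radix: 2)) // 10000000
print(i3)                                      // -128
```

Both print `10000000`, confirming that only the interpretation of the bits changes, not the bits themselves.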

`init(_ source:)` produces an out-of-range error because it interprets the numeric value. The possible values for `Int8` are between `-128` and `127`. `u3` is `128`, so it is out of range.
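
As a side note, if you want the range check of `init(_ source:)` without the runtime error, the standard library also provides the failable `init(exactly:)`, which returns `nil` for out-of-range values instead of trapping. A minimal sketch:

```
let u2 = UInt8(0x7F)
let u3 = UInt8(0x80)

// init(exactly:) returns nil when the value does not fit,
// so it works as a non-crashing range check.
print(Int8(exactly: u2) as Any) // Optional(127)
print(Int8(exactly: u3) as Any) // nil
```

This is convenient when the input comes from outside your program and an out-of-range value should be handled rather than crash.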