// The string `s` only contains digits.
var factor int
for i, c := range s[:12] {
	if i%2 == 0 {
		factor = 1
	} else {
		factor = 3
	}
	buf := make([]byte, 1)
	_ = utf8.EncodeRune(buf, c)
	value, _ := strconv.Atoi(string(buf))
	sum += value * factor
}
The problem is simpler than it looks. You convert a rune value to an int value with `int(r)`. But your code implies you want the integer value of the digit out of its ASCII (or UTF-8) representation, which you can trivially get with `r - '0'` as a rune, or `int(r - '0')` as an int. Be aware that runes outside `'0'`–`'9'` will corrupt that logic, so validate the input first if it isn't guaranteed to be all digits.