Apache Spark SQL - Scala extraction/pattern matching using a companion object
The code below is taken from Spark SQL. It performs extraction based on castType inside a companion object - e.g. a call like TypeCast.castTo("abc", StringType).
Please explain how pattern matching with companion objects works under the hood.
import java.math.BigDecimal
import java.sql.{Date, Timestamp}

import org.apache.spark.sql.types._

private[csv] def castTo(datum: String, castType: DataType): Any = {
  castType match {
    case _: ByteType => datum.toByte
    case _: ShortType => datum.toShort
    case _: IntegerType => datum.toInt
    case _: LongType => datum.toLong
    case _: FloatType => datum.toFloat
    case _: DoubleType => datum.toDouble
    case _: BooleanType => datum.toBoolean
    case _: DecimalType => new BigDecimal(datum.replaceAll(",", ""))
    case _: TimestampType => Timestamp.valueOf(datum)
    case _: DateType => Date.valueOf(datum)
    case _: StringType => datum
    case _ => throw new RuntimeException(s"Unsupported type: ${castType.typeName}")
  }
}
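For reference, a few hypothetical example calls (my own sketch; castTo is private[csv], so these would only compile from within that package):

import org.apache.spark.sql.types._

// Hypothetical calls, assuming castTo lives on the TypeCast object
// mentioned above:
TypeCast.castTo("42", IntegerType)   // 42 (an Int, returned as Any)
TypeCast.castTo("abc", StringType)   // "abc"
TypeCast.castTo("abc", IntegerType)  // throws NumberFormatException from toInt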
Added: Based on my understanding, extraction/pattern matching is implemented via the unapply method of the companion object. Here is an example of how case classes are implemented under the hood:
trait User {
  def name: String
}

class FreeUser(val name: String) extends User
class PremiumUser(val name: String) extends User

object FreeUser {
  def unapply(user: FreeUser): Option[String] = Some(user.name)
}

object PremiumUser {
  def unapply(user: PremiumUser): Option[String] = Some(user.name)
}
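To show what these extractors do, here is a small usage sketch (my addition, not part of the original question). For a pattern like FreeUser(name), the compiler first checks the runtime type, then calls the companion's unapply and binds name to the value inside the Some it returns:

object ExtractorDemo {
  def greet(user: User): String = user match {
    // Desugars roughly to: if (user.isInstanceOf[FreeUser])
    //   FreeUser.unapply(user.asInstanceOf[FreeUser]) ... and so on.
    case FreeUser(name)    => s"Hello $name"
    case PremiumUser(name) => s"Welcome back, dear $name"
    case _                 => "Who are you?"
  }

  def main(args: Array[String]): Unit = {
    println(greet(new FreeUser("Daniel")))    // Hello Daniel
    println(greet(new PremiumUser("Martha"))) // Welcome back, dear Martha
  }
}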
I don't understand how the same thing can be done with companion classes in the Spark code above.
I have never worked with Spark, so I cannot answer with Spark-specific knowledge.

But the pattern matching in your example seems to be ordinary type-based pattern matching: castType is an instance of type DataType, which has several subclasses, e.g. ByteType. The pattern match tests whether the castType object belongs to a specific class (e.g. ByteType); no unapply extractor on a companion object is involved.

Because several of the sub-types exist as singleton instances, the match does not need to bind the matched instance at all. That is why each case uses the placeholder _.
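A minimal, self-contained sketch of that mechanism (the DataType hierarchy below is a stand-in I made up, not Spark's actual classes): a type pattern like case _: ByteType compiles down to an instance-of test plus a cast, so no companion-object unapply is needed.

object TypePatternDemo {
  sealed trait DataType
  class ByteType extends DataType
  class StringType extends DataType

  // Singleton instances of the sub-types, mirroring how Spark exposes
  // singleton objects such as ByteType alongside the classes.
  object ByteType extends ByteType
  object StringType extends StringType

  def describe(t: DataType): String = t match {
    // Each type pattern is essentially t.isInstanceOf[ByteType];
    // the wildcard _ means the matched value is never bound to a name.
    case _: ByteType   => "a byte column"
    case _: StringType => "a string column"
  }

  def main(args: Array[String]): Unit = {
    println(describe(ByteType))   // a byte column
    println(describe(StringType)) // a string column
  }
}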