Hans-Juergen Rennau <hrennau@y...> writes:
> Roger, I would find it interesting to compare an awk solution with an
> XQuery one, also considering aspects like clarity and
> extensibility. Especially interesting as the potential of XQuery for
> tool building is by and large ignored.
Agreed!
> ...
>
> PS. Example of an XQuery-based solution:
>
> declare variable $uri external;
> declare variable $sep external := '&#9;';
> <document>{
>     let $lines := unparsed-text-lines($uri)
>     let $names := $lines => head() => tokenize($sep)
>     for $line in tail($lines) return
>         <row>{
>             for $field at $pos in tokenize($line, $sep) return
>                 element {$names[$pos]} {$field}
>         }</row>
> }</document>
This is good (and should work anywhere), but after spending a little
time on my own CSV parsing routines I realized that in BaseX, the
simplest thing to do is just to call
csv:parse(unparsed-text($uri), map { 'header': 'yes'})
That is for comma-separated values; I think for tab-separated values one
would have to specify an additional option.
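For what it's worth, a sketch of the tab-separated case (untested; it assumes the BaseX CSV Module's 'separator' option, which takes named values such as 'tab'):

    (: parse TSV with a header row into XML; 'separator': 'tab'
       is the assumed extra option for tab-separated input :)
    csv:parse(
      unparsed-text($uri),
      map { 'header': true(), 'separator': 'tab' }
    )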
I don't have time to check, but I have a dim recollection that eXist
also has a function for reading CSV.
--
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
http://blackmesatech.com