#RubyOnRails - 23 February 2019
[05:50:04] hahuang65: hmm can someone explain to me why Rails is connecting through a TCP socket when the host and port lines are commented out in database.yml?
[05:50:21] hahuang65: Trying to figure out why on my laptop it's doing that, but on my desktop it looks for the Unix domain socket by default.
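A plausible explanation, not confirmed in the log: with the PostgreSQL adapter, libpq only falls back to the Unix domain socket when no host is configured *anywhere*, and environment variables such as `PGHOST` silently reintroduce TCP on one machine but not the other. A minimal sketch of the relevant part of `database.yml` (names are illustrative):

```yaml
# config/database.yml -- a sketch, assuming the postgresql adapter
development:
  adapter: postgresql
  database: myapp_development
  # With no `host:` key, libpq uses its own default, normally the local
  # Unix domain socket -- unless an env var such as PGHOST overrides it,
  # which can make two machines behave differently with identical configs.
  # host: localhost   # uncommenting this forces a TCP connection
  # port: 5432
```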
[11:23:26] catbusters: `create_table "friendly_id_slugs", id: :integer, default: nil, force: :cascade do |t|`
[11:55:51] dionysus70: is it considered ethical to reuse pieces of code in multiple projects? I wrote them, but they are for different clients
[11:56:54] dionysus70: login / registration views would be quite similar, with different branding of course
[12:58:17] syndikate: If my clean up strategy is truncation does it truncate only the data added in that test or the whole table?
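Assuming the gem in question is DatabaseCleaner: truncation empties entire tables, not just the rows a given test inserted; by default every table except Rails' schema tables is truncated, and narrowing the scope is opt-in. A config sketch (table names are made up):

```ruby
# Assuming the database_cleaner gem. Truncation wipes whole tables,
# regardless of which test added the rows.
DatabaseCleaner.strategy = :truncation, { only: %w[posts comments] }

# ...or keep large, rarely-touched reference tables intact instead:
DatabaseCleaner.strategy = :truncation, { except: %w[countries] }
```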
[16:00:24] randohinn: Is there an efficient way to search a HUUGE xml dataset in ruby? several tens of thousands of rows
[16:06:50] havenwood: randohinn: Nokogiri has stream parsing too. Oga is another good mention for stream parsing.
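The dataset's structure isn't shown in the log, so here is the streaming idea sketched with stdlib REXML instead (the `<row>`/`id`/`name` layout is an assumption); Nokogiri's `Reader` and Oga's SAX API follow the same pattern, never holding the whole document in memory:

```ruby
require "rexml/parsers/streamparser"
require "rexml/streamlistener"

# Stream-parse XML without building a DOM -- memory use stays flat even for
# hundreds of thousands of rows. Element names below are assumptions.
class RowFinder
  include REXML::StreamListener
  attr_reader :matches

  def initialize(wanted_name)
    @wanted  = wanted_name
    @matches = []
    @tag     = nil
    @row     = {}
  end

  def tag_start(name, _attrs)
    @tag = name
    @row = {} if name == "row"   # a fresh accumulator per <row>
  end

  def text(data)
    @row[@tag] = data.strip if @tag && !data.strip.empty?
  end

  def tag_end(name)
    @matches << @row if name == "row" && @row["name"] == @wanted
    @tag = nil
  end
end

# In real use you would pass File.open(path) rather than a literal string.
xml    = "<rows><row><id>1</id><name>alpha</name></row>" \
         "<row><id>2</id><name>beta</name></row></rows>"
finder = RowFinder.new("beta")
REXML::Parsers::StreamParser.new(xml, finder).parse
p finder.matches
```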
[16:28:39] randohinn: If only I could do requests to the db this dataset is from, without a contract
[18:06:30] randohinn: Getting a NoMethodError: undefined method `to_sym' for [:unique, true]:Array when adding an index
[18:06:59] randohinn: to a db that will not have a model (I only need to run one simple query on it)
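That `to_sym` error on `[:unique, true]` is typically raised when index options are passed positionally instead of as keyword arguments; a sketch of the likely fix inside a Rails migration (table and column names are hypothetical):

```ruby
# add_index :entries, :external_id, [:unique, true]   # wrong: options as an Array
add_index :entries, :external_id, unique: true        # options must be keywords
```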
[18:08:12] catbusters: Maybe you could use https://api.rubyonrails.org/classes/ActiveRecord/ConnectionAdapters/DatabaseStatements.html
[18:12:11] ule: randohinn: What do you mean by "make a DB"? Create a table in your database and not necessarily have a model class for it?
[18:15:26] randohinn: I'm literally going to be querying for a single field from that table, to save myself from trying to search from 290k rows of an XML file instead
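The DatabaseStatements API catbusters linked supports exactly this: raw SQL through the existing connection, no model required. A sketch, assuming a Rails app with an established connection (table and column names are made up):

```ruby
# Fetch one field from a model-less table via the raw connection.
sql = ActiveRecord::Base.sanitize_sql_array(
  ["SELECT name FROM imported_rows WHERE external_id = ?", 42]
)
value = ActiveRecord::Base.connection.select_value(sql)
```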
[18:16:16] ule: lol I did something similar here, but it was to avoid iterating over thousands of lines of a CSV
[18:17:04] ule: randohinn: here, we basically dump the CSV file into a temporary table, then we trigger workers via Sidekiq, using a gem that does batch processing so Sidekiq can digest that entire list in parallel
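The batch-processing gem isn't named in the log; the slicing idea itself can be sketched with stdlib CSV (column names and the worker call are hypothetical):

```ruby
require "csv"
require "stringio"

# Hypothetical CSV; in the real flow each slice's ids would be handed to a
# Sidekiq worker (e.g. ImportWorker.perform_async(ids)) instead of collected.
data = "id,email\n1,a@example.com\n2,b@example.com\n3,c@example.com\n"

batches = CSV.new(StringIO.new(data), headers: true)
             .each_slice(2)                              # 2 rows per job
             .map { |rows| rows.map { |row| row["id"] } }

p batches  # => [["1", "2"], ["3"]]
```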
[18:17:07] randohinn: yup.. API access to this dataset costs, but they provide a download for free :D
[18:18:22] ule: randohinn: I saw a tweet from UncleBob this morning criticizing XML. He said it is a terrible thing