Initial Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

LLM Generation

Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

Output

Here's the JSON metadata for the models you've listed, following the format you specified:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "RedPajama", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "RedPajama", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}
This JSON structure includes all the models you listed, with their respective sizes in billions of parameters, an empty string for the dataset (except for RedPajama models), and the appropriate family name from the provided list. The gpt2-xl model is correctly listed with 1.5 billion parameters as specified.

Extract Json

I extracted the following JSON from that output:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "RedPajama", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "RedPajama", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}
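The "Extract Json" step above pulls the JSON object out of the model's free-form reply before evaluation. The benchmark's actual extractor is not shown in this log; a minimal sketch of one plausible approach (a hypothetical `extract_json` helper that scans for the first balanced brace pair) is:

```python
import json

def extract_json(text):
    """Pull the first balanced {...} block out of free-form LLM output and parse it.

    Simplification for illustration: braces inside JSON string values are not
    handled; a production extractor would track string/escape state too.
    """
    start = text.find("{")
    if start == -1:
        raise ValueError("no JSON object found")
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(text[start:i + 1])
    raise ValueError("no balanced JSON object found")

out = 'Here is the JSON: {"gpt2-xl": {"size": 1.5}} as requested.'
print(extract_json(out))  # {'gpt2-xl': {'size': 1.5}}
```

This handles both the fenced outputs and the ones where the JSON is embedded mid-sentence, since it keys on braces rather than on code fences.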

JSON Subset Evaluator

Testing if the previous output matches the JSON: { "Mistral-7B-v0.1": { "size": 7, "dataset": "", "family": "Mistral" }, "RedPajama-INCITE-7B-Base": { "size": 7, "dataset": "", "family": "RedPajama" }, "RedPajama-INCITE-Base-3B-v1": { "size": 3, "dataset": "", "family": "RedPajama" }, "falcon40b": { "size": 40, "dataset": "", "family": "falcon" }, "falcon7b": { "size": 7, "dataset": "", "family": "falcon" }, "gpt2-xl": { "size": 1.5, "dataset": "", "family": "gpt2" }, "llama-65b": { "size": 65, "dataset": "", "family": "llama" }, "llama-7b": { "size": 7, "dataset": "", "family": "llama" }, "neo-1.3": { "size": 1.3, "dataset": "", "family": "neo" }, "neo-2.7": { "size": 2.7, "dataset": "", "family": "neo" }, "neo-6": { "size": 6, "dataset": "", "family": "neo" }, "open_llama_3b_v2": { "size": 3, "dataset": "", "family": "open_llama" }, "open_llama_7b_v2": { "size": 7, "dataset": "", "family": "open_llama" }, "opt-1.3b": { "size": 1.3, "dataset": "", "family": "opt" }, "opt-6.7b": { "size": 6.7, "dataset": "", "family": "opt" }, "pythia-1.4": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-1.4-dedup": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-6.9": { "size": 6.9, "dataset": "", "family": "pythia" }, "pythia-6.9-dedup": { "size": 6.9, "dataset": "", "family": "pythia" } }: False
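This trial fails because the model filled in `"dataset": "RedPajama"` where the reference answer expects an empty string. The evaluator's own code is not shown in this log, but a recursive subset check of the kind its name suggests can be sketched as follows (the `is_subset` name is illustrative, not the benchmark's):

```python
def is_subset(expected, actual):
    """Recursively check that every key/value pair in `expected` also
    appears, with an equal value, in `actual`. Extra keys in `actual`
    are allowed; mismatched leaf values are not."""
    if isinstance(expected, dict):
        return isinstance(actual, dict) and all(
            k in actual and is_subset(v, actual[k]) for k, v in expected.items()
        )
    return expected == actual

expected = {"RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "", "family": "RedPajama"}}
actual = {"RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "RedPajama", "family": "RedPajama"}}
print(is_subset(expected, actual))  # False: "" != "RedPajama"
```

Under this reading, a model answer passes only when every expected field matches exactly, which is why the later trials that leave `"dataset"` empty evaluate to True while this one does not.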

Initial Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

LLM Generation

Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

Output

Here's the JSON metadata for the given models according to the specified format:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "INCITE", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "INCITE", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}
This JSON metadata includes all the models you've listed, with their respective sizes in billions of parameters, dataset information (where applicable), and family classification based on the provided list. The "dataset" field is left empty ("") for most models as specific dataset information wasn't provided, except for the RedPajama models which include "INCITE" as their dataset.

Extract Json

I extracted the following JSON from that output:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "INCITE", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "INCITE", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

JSON Subset Evaluator

Testing if the previous output matches the JSON: { "Mistral-7B-v0.1": { "size": 7, "dataset": "", "family": "Mistral" }, "RedPajama-INCITE-7B-Base": { "size": 7, "dataset": "", "family": "RedPajama" }, "RedPajama-INCITE-Base-3B-v1": { "size": 3, "dataset": "", "family": "RedPajama" }, "falcon40b": { "size": 40, "dataset": "", "family": "falcon" }, "falcon7b": { "size": 7, "dataset": "", "family": "falcon" }, "gpt2-xl": { "size": 1.5, "dataset": "", "family": "gpt2" }, "llama-65b": { "size": 65, "dataset": "", "family": "llama" }, "llama-7b": { "size": 7, "dataset": "", "family": "llama" }, "neo-1.3": { "size": 1.3, "dataset": "", "family": "neo" }, "neo-2.7": { "size": 2.7, "dataset": "", "family": "neo" }, "neo-6": { "size": 6, "dataset": "", "family": "neo" }, "open_llama_3b_v2": { "size": 3, "dataset": "", "family": "open_llama" }, "open_llama_7b_v2": { "size": 7, "dataset": "", "family": "open_llama" }, "opt-1.3b": { "size": 1.3, "dataset": "", "family": "opt" }, "opt-6.7b": { "size": 6.7, "dataset": "", "family": "opt" }, "pythia-1.4": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-1.4-dedup": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-6.9": { "size": 6.9, "dataset": "", "family": "pythia" }, "pythia-6.9-dedup": { "size": 6.9, "dataset": "", "family": "pythia" } }: False

Initial Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

LLM Generation

Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

Output

Here's the JSON metadata for the given models based on the specified format:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "RedPajama", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "RedPajama", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

Extract Json

I extracted the following JSON from that output:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "RedPajama", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "RedPajama", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

JSON Subset Evaluator

Testing if the previous output matches the JSON: { "Mistral-7B-v0.1": { "size": 7, "dataset": "", "family": "Mistral" }, "RedPajama-INCITE-7B-Base": { "size": 7, "dataset": "", "family": "RedPajama" }, "RedPajama-INCITE-Base-3B-v1": { "size": 3, "dataset": "", "family": "RedPajama" }, "falcon40b": { "size": 40, "dataset": "", "family": "falcon" }, "falcon7b": { "size": 7, "dataset": "", "family": "falcon" }, "gpt2-xl": { "size": 1.5, "dataset": "", "family": "gpt2" }, "llama-65b": { "size": 65, "dataset": "", "family": "llama" }, "llama-7b": { "size": 7, "dataset": "", "family": "llama" }, "neo-1.3": { "size": 1.3, "dataset": "", "family": "neo" }, "neo-2.7": { "size": 2.7, "dataset": "", "family": "neo" }, "neo-6": { "size": 6, "dataset": "", "family": "neo" }, "open_llama_3b_v2": { "size": 3, "dataset": "", "family": "open_llama" }, "open_llama_7b_v2": { "size": 7, "dataset": "", "family": "open_llama" }, "opt-1.3b": { "size": 1.3, "dataset": "", "family": "opt" }, "opt-6.7b": { "size": 6.7, "dataset": "", "family": "opt" }, "pythia-1.4": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-1.4-dedup": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-6.9": { "size": 6.9, "dataset": "", "family": "pythia" }, "pythia-6.9-dedup": { "size": 6.9, "dataset": "", "family": "pythia" } }: False

Initial Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

LLM Generation

Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

Output

Here's the JSON metadata for the models you've listed, following the format you specified:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

Extract Json

I extracted the following JSON from that output:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

JSON Subset Evaluator

Testing if the previous output matches the JSON: { "Mistral-7B-v0.1": { "size": 7, "dataset": "", "family": "Mistral" }, "RedPajama-INCITE-7B-Base": { "size": 7, "dataset": "", "family": "RedPajama" }, "RedPajama-INCITE-Base-3B-v1": { "size": 3, "dataset": "", "family": "RedPajama" }, "falcon40b": { "size": 40, "dataset": "", "family": "falcon" }, "falcon7b": { "size": 7, "dataset": "", "family": "falcon" }, "gpt2-xl": { "size": 1.5, "dataset": "", "family": "gpt2" }, "llama-65b": { "size": 65, "dataset": "", "family": "llama" }, "llama-7b": { "size": 7, "dataset": "", "family": "llama" }, "neo-1.3": { "size": 1.3, "dataset": "", "family": "neo" }, "neo-2.7": { "size": 2.7, "dataset": "", "family": "neo" }, "neo-6": { "size": 6, "dataset": "", "family": "neo" }, "open_llama_3b_v2": { "size": 3, "dataset": "", "family": "open_llama" }, "open_llama_7b_v2": { "size": 7, "dataset": "", "family": "open_llama" }, "opt-1.3b": { "size": 1.3, "dataset": "", "family": "opt" }, "opt-6.7b": { "size": 6.7, "dataset": "", "family": "opt" }, "pythia-1.4": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-1.4-dedup": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-6.9": { "size": 6.9, "dataset": "", "family": "pythia" }, "pythia-6.9-dedup": { "size": 6.9, "dataset": "", "family": "pythia" } }: True

Initial Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

LLM Generation

Query

Create a JSON metadata for these models: Mistral-7B-v0.1 RedPajama-INCITE-7B-Base RedPajama-INCITE-Base-3B-v1 falcon40b falcon7b gpt2-xl llama-65b llama-7b neo-1.3 neo-2.7 neo-6 open_llama_3b_v2 open_llama_7b_v2 opt-1.3b opt-6.7b pythia-1.4 pythia-1.4-dedup pythia-6.9 pythia-6.9-dedup With the format: {"Mistral-7B-v0.1": {"size": 7, dataset: "", "family": "Mistral"}, ...} where family is one of base = [ 'pythia', 'llama', 'Mistral', 'gpt2', 'opt', 'RedPajama', 'neo', 'open_llama', 'falcon' ] gpt2-xl is 1.5b parameters.

Output

Here's the JSON metadata for the models you've listed, following the specified format:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

Extract Json

I extracted the following JSON from that output:

{
  "Mistral-7B-v0.1": {"size": 7, "dataset": "", "family": "Mistral"},
  "RedPajama-INCITE-7B-Base": {"size": 7, "dataset": "", "family": "RedPajama"},
  "RedPajama-INCITE-Base-3B-v1": {"size": 3, "dataset": "", "family": "RedPajama"},
  "falcon40b": {"size": 40, "dataset": "", "family": "falcon"},
  "falcon7b": {"size": 7, "dataset": "", "family": "falcon"},
  "gpt2-xl": {"size": 1.5, "dataset": "", "family": "gpt2"},
  "llama-65b": {"size": 65, "dataset": "", "family": "llama"},
  "llama-7b": {"size": 7, "dataset": "", "family": "llama"},
  "neo-1.3": {"size": 1.3, "dataset": "", "family": "neo"},
  "neo-2.7": {"size": 2.7, "dataset": "", "family": "neo"},
  "neo-6": {"size": 6, "dataset": "", "family": "neo"},
  "open_llama_3b_v2": {"size": 3, "dataset": "", "family": "open_llama"},
  "open_llama_7b_v2": {"size": 7, "dataset": "", "family": "open_llama"},
  "opt-1.3b": {"size": 1.3, "dataset": "", "family": "opt"},
  "opt-6.7b": {"size": 6.7, "dataset": "", "family": "opt"},
  "pythia-1.4": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-1.4-dedup": {"size": 1.4, "dataset": "", "family": "pythia"},
  "pythia-6.9": {"size": 6.9, "dataset": "", "family": "pythia"},
  "pythia-6.9-dedup": {"size": 6.9, "dataset": "", "family": "pythia"}
}

JSON Subset Evaluator

Testing if the previous output matches the JSON: { "Mistral-7B-v0.1": { "size": 7, "dataset": "", "family": "Mistral" }, "RedPajama-INCITE-7B-Base": { "size": 7, "dataset": "", "family": "RedPajama" }, "RedPajama-INCITE-Base-3B-v1": { "size": 3, "dataset": "", "family": "RedPajama" }, "falcon40b": { "size": 40, "dataset": "", "family": "falcon" }, "falcon7b": { "size": 7, "dataset": "", "family": "falcon" }, "gpt2-xl": { "size": 1.5, "dataset": "", "family": "gpt2" }, "llama-65b": { "size": 65, "dataset": "", "family": "llama" }, "llama-7b": { "size": 7, "dataset": "", "family": "llama" }, "neo-1.3": { "size": 1.3, "dataset": "", "family": "neo" }, "neo-2.7": { "size": 2.7, "dataset": "", "family": "neo" }, "neo-6": { "size": 6, "dataset": "", "family": "neo" }, "open_llama_3b_v2": { "size": 3, "dataset": "", "family": "open_llama" }, "open_llama_7b_v2": { "size": 7, "dataset": "", "family": "open_llama" }, "opt-1.3b": { "size": 1.3, "dataset": "", "family": "opt" }, "opt-6.7b": { "size": 6.7, "dataset": "", "family": "opt" }, "pythia-1.4": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-1.4-dedup": { "size": 1.4, "dataset": "", "family": "pythia" }, "pythia-6.9": { "size": 6.9, "dataset": "", "family": "pythia" }, "pythia-6.9-dedup": { "size": 6.9, "dataset": "", "family": "pythia" } }: True